go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-api\-machinery\]\sServers\swith\ssupport\sfor\sAPI\schunking\sshould\ssupport\scontinue\slisting\sfrom\sthe\slast\skey\sif\sthe\soriginal\sversion\shas\sbeen\scompacted\saway\,\sthough\sthe\slist\sis\sinconsistent\s\[Slow\]$'
test/e2e/apimachinery/chunking.go:177
k8s.io/kubernetes/test/e2e/apimachinery.glob..func4.3()
	test/e2e/apimachinery/chunking.go:177 +0x7fc

There were additional failures detected after the initial failure:
[FAILED] Nov 26 18:49:14.127: failed to list events in namespace "chunking-8980": Get "https://34.83.88.61/api/v1/namespaces/chunking-8980/events": dial tcp 34.83.88.61:443: connect: connection refused
In [DeferCleanup (Each)] at: test/e2e/framework/debug/dump.go:44
----------
[FAILED] Nov 26 18:49:14.168: Couldn't delete ns: "chunking-8980": Delete "https://34.83.88.61/api/v1/namespaces/chunking-8980": dial tcp 34.83.88.61:443: connect: connection refused (&url.Error{Op:"Delete", URL:"https://34.83.88.61/api/v1/namespaces/chunking-8980", Err:(*net.OpError)(0xc0038bfb80)})
In [DeferCleanup (Each)] at: test/e2e/framework/framework.go:370

from junit_01.xml
[BeforeEach] [sig-api-machinery] Servers with support for API chunking set up framework | framework.go:178 STEP: Creating a kubernetes client 11/26/22 18:44:15.836 Nov 26 18:44:15.836: INFO: >>> kubeConfig: /workspace/.kube/config STEP: Building a namespace api object, basename chunking 11/26/22 18:44:15.838 STEP: Waiting for a default service account to be provisioned in namespace 11/26/22 18:44:16.126 STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 11/26/22 18:44:16.253 [BeforeEach] [sig-api-machinery] Servers with support for API chunking test/e2e/framework/metrics/init/init.go:31 [BeforeEach] [sig-api-machinery] Servers with support for API chunking test/e2e/apimachinery/chunking.go:51 STEP: creating a large number of resources 11/26/22 18:44:16.344 [It] should support continue listing from the last key if the original version has been compacted away, though the list is inconsistent [Slow] test/e2e/apimachinery/chunking.go:126 STEP: retrieving the first page 11/26/22 18:44:33.903 Nov 26 18:44:33.968: INFO: Retrieved 40/40 results with rv 3657 and continue eyJ2IjoibWV0YS5rOHMuaW8vdjEiLCJydiI6MzY1Nywic3RhcnQiOiJ0ZW1wbGF0ZS0wMDM5XHUwMDAwIn0 STEP: retrieving the second page until the token expires 11/26/22 18:44:33.968 Nov 26 18:44:54.034: INFO: Token eyJ2IjoibWV0YS5rOHMuaW8vdjEiLCJydiI6MzY1Nywic3RhcnQiOiJ0ZW1wbGF0ZS0wMDM5XHUwMDAwIn0 has not expired yet Nov 26 18:45:14.062: INFO: Token eyJ2IjoibWV0YS5rOHMuaW8vdjEiLCJydiI6MzY1Nywic3RhcnQiOiJ0ZW1wbGF0ZS0wMDM5XHUwMDAwIn0 has not expired yet Nov 26 18:45:34.153: INFO: Token eyJ2IjoibWV0YS5rOHMuaW8vdjEiLCJydiI6MzY1Nywic3RhcnQiOiJ0ZW1wbGF0ZS0wMDM5XHUwMDAwIn0 has not expired yet Nov 26 18:45:54.079: INFO: Token eyJ2IjoibWV0YS5rOHMuaW8vdjEiLCJydiI6MzY1Nywic3RhcnQiOiJ0ZW1wbGF0ZS0wMDM5XHUwMDAwIn0 has not expired yet Nov 26 18:46:14.033: INFO: Token eyJ2IjoibWV0YS5rOHMuaW8vdjEiLCJydiI6MzY1Nywic3RhcnQiOiJ0ZW1wbGF0ZS0wMDM5XHUwMDAwIn0 has not expired yet Nov 26 18:46:34.047: INFO: Token eyJ2IjoibWV0YS5rOHMuaW8vdjEiLCJydiI6MzY1Nywic3RhcnQiOiJ0ZW1wbGF0ZS0wMDM5XHUwMDAwIn0 has not expired yet Nov 26 18:46:54.019: INFO: Token eyJ2IjoibWV0YS5rOHMuaW8vdjEiLCJydiI6MzY1Nywic3RhcnQiOiJ0ZW1wbGF0ZS0wMDM5XHUwMDAwIn0 has not expired yet Nov 26 18:47:14.014: INFO: Token eyJ2IjoibWV0YS5rOHMuaW8vdjEiLCJydiI6MzY1Nywic3RhcnQiOiJ0ZW1wbGF0ZS0wMDM5XHUwMDAwIn0 has not expired yet Nov 26 18:47:34.014: INFO: Token eyJ2IjoibWV0YS5rOHMuaW8vdjEiLCJydiI6MzY1Nywic3RhcnQiOiJ0ZW1wbGF0ZS0wMDM5XHUwMDAwIn0 has not expired yet Nov 26 18:47:54.012: INFO: Token eyJ2IjoibWV0YS5rOHMuaW8vdjEiLCJydiI6MzY1Nywic3RhcnQiOiJ0ZW1wbGF0ZS0wMDM5XHUwMDAwIn0 has not expired yet Nov 26 18:48:14.012: INFO: Token eyJ2IjoibWV0YS5rOHMuaW8vdjEiLCJydiI6MzY1Nywic3RhcnQiOiJ0ZW1wbGF0ZS0wMDM5XHUwMDAwIn0 has not expired yet Nov 26 18:48:34.012: INFO: Token eyJ2IjoibWV0YS5rOHMuaW8vdjEiLCJydiI6MzY1Nywic3RhcnQiOiJ0ZW1wbGF0ZS0wMDM5XHUwMDAwIn0 has not expired yet Nov 26 18:48:54.014: INFO: Token eyJ2IjoibWV0YS5rOHMuaW8vdjEiLCJydiI6MzY1Nywic3RhcnQiOiJ0ZW1wbGF0ZS0wMDM5XHUwMDAwIn0 has not expired yet STEP: retrieving the second page again with the token received with the error message 11/26/22 18:49:14.008 Nov 26 18:49:14.047: INFO: Unexpected error: failed to list pod templates in namespace: chunking-8980, given inconsistent continue token and limit: 40: <*url.Error | 0xc00436e510>: { Op: "Get", URL: "https://34.83.88.61/api/v1/namespaces/chunking-8980/podtemplates?limit=40", Err: <*net.OpError | 0xc0038bf950>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc000f7ad80>{ IP: [0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 255, 255, 34, 83, 88, 61], Port: 443, Zone: "", }, Err: <*os.SyscallError | 0xc004124400>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, } Nov 26 18:49:14.047: FAIL: failed to list pod templates in namespace: chunking-8980, given inconsistent continue token and limit: 40: Get "https://34.83.88.61/api/v1/namespaces/chunking-8980/podtemplates?limit=40": dial tcp 34.83.88.61:443: connect: connection refused Full Stack Trace k8s.io/kubernetes/test/e2e/apimachinery.glob..func4.3() test/e2e/apimachinery/chunking.go:177 +0x7fc [AfterEach] [sig-api-machinery] Servers with support for API chunking test/e2e/framework/node/init/init.go:32 Nov 26 18:49:14.048: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready [DeferCleanup (Each)] [sig-api-machinery] Servers with support for API chunking test/e2e/framework/metrics/init/init.go:33 [DeferCleanup (Each)] [sig-api-machinery] Servers with support for API chunking dump namespaces | framework.go:196 STEP: dump namespace information after failure 11/26/22 18:49:14.087 STEP: Collecting events from namespace "chunking-8980". 11/26/22 18:49:14.087 Nov 26 18:49:14.127: INFO: Unexpected error: failed to list events in namespace "chunking-8980": <*url.Error | 0xc000f7adb0>: { Op: "Get", URL: "https://34.83.88.61/api/v1/namespaces/chunking-8980/events", Err: <*net.OpError | 0xc0050a4000>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc002a58540>{ IP: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 255, 255, 34, 83, 88, 61], Port: 443, Zone: "", }, Err: <*os.SyscallError | 0xc00114eee0>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, } Nov 26 18:49:14.127: FAIL: failed to list events in namespace "chunking-8980": Get "https://34.83.88.61/api/v1/namespaces/chunking-8980/events": dial tcp 34.83.88.61:443: connect: connection refused Full Stack Trace k8s.io/kubernetes/test/e2e/framework/debug.dumpEventsInNamespace(0xc003b845c0, {0xc002ab81b0, 0xd}) test/e2e/framework/debug/dump.go:44 +0x191 k8s.io/kubernetes/test/e2e/framework/debug.DumpAllNamespaceInfo({0x801de88, 0xc000b9c4e0}, {0xc002ab81b0, 0xd}) test/e2e/framework/debug/dump.go:62 +0x8d k8s.io/kubernetes/test/e2e/framework/debug/init.init.0.func1.1(0xc003b84650?, {0xc002ab81b0?, 0x7fa7740?}) test/e2e/framework/debug/init/init.go:34 +0x32 k8s.io/kubernetes/test/e2e/framework.(*Framework).dumpNamespaceInfo.func1() test/e2e/framework/framework.go:274 +0x6d k8s.io/kubernetes/test/e2e/framework.(*Framework).dumpNamespaceInfo(0xc00108a780) test/e2e/framework/framework.go:271 +0x179 reflect.Value.call({0x6627cc0?, 0xc002934870?, 0xc004bdafb0?}, {0x75b6e72, 0x4}, {0xae73300, 0x0, 0xc00012ed88?}) /usr/local/go/src/reflect/value.go:584 +0x8c5 reflect.Value.Call({0x6627cc0?, 0xc002934870?, 0x29449fc?}, {0xae73300?, 0xc004bdaf80?, 0x2a6d786?}) /usr/local/go/src/reflect/value.go:368 +0xbc [DeferCleanup (Each)] [sig-api-machinery] Servers with support for API chunking tear down framework | framework.go:193 STEP: Destroying namespace "chunking-8980" for this suite. 
11/26/22 18:49:14.128 Nov 26 18:49:14.168: FAIL: Couldn't delete ns: "chunking-8980": Delete "https://34.83.88.61/api/v1/namespaces/chunking-8980": dial tcp 34.83.88.61:443: connect: connection refused (&url.Error{Op:"Delete", URL:"https://34.83.88.61/api/v1/namespaces/chunking-8980", Err:(*net.OpError)(0xc0038bfb80)}) Full Stack Trace k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach.func1() test/e2e/framework/framework.go:370 +0x4fe k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc00108a780) test/e2e/framework/framework.go:383 +0x1ca reflect.Value.call({0x6627cc0?, 0xc0029347f0?, 0xc004060fb0?}, {0x75b6e72, 0x4}, {0xae73300, 0x0, 0x0?}) /usr/local/go/src/reflect/value.go:584 +0x8c5 reflect.Value.Call({0x6627cc0?, 0xc0029347f0?, 0x0?}, {0xae73300?, 0x5?, 0xc00349ec40?}) /usr/local/go/src/reflect/value.go:368 +0xbc
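The chunking test above pages through pod templates with limit=40, waits for its continue token to expire against a compacted resource version, and then retries using the fresh (but inconsistent) token returned with the 410 error; here the retry happened to land while the apiserver at 34.83.88.61:443 was refusing connections. A minimal client-go sketch of that paging pattern follows. It is not the e2e test's own code: the kubeconfig path, namespace, and error handling are assumptions based on the steps logged above.

package main

import (
	"context"
	"fmt"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumption: kubeconfig location; the run above used /workspace/.kube/config.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/workspace/.kube/config")
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)
	ns := "default" // hypothetical namespace, not taken from the test

	opts := metav1.ListOptions{Limit: 40}
	for {
		list, err := client.CoreV1().PodTemplates(ns).List(context.TODO(), opts)
		if apierrors.IsResourceExpired(err) {
			// The continue token referenced a compacted resource version (410 Gone).
			// The error's status carries a new token that resumes from the last key,
			// at the cost of a list that is no longer from a single resource version.
			status, ok := err.(apierrors.APIStatus)
			if !ok {
				panic(err)
			}
			opts.Continue = status.Status().ListMeta.Continue
			continue
		}
		if err != nil {
			panic(err) // e.g. "dial tcp ...:443: connect: connection refused" as seen above
		}
		fmt.Printf("got %d items, rv=%s\n", len(list.Items), list.ResourceVersion)
		if list.Continue == "" {
			break // last page
		}
		opts.Continue = list.Continue
	}
}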
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-apps\]\sStatefulSet\sBasic\sStatefulSet\sfunctionality\s\[StatefulSetBasic\]\sBurst\sscaling\sshould\srun\sto\scompletion\seven\swith\sunhealthy\spods\s\[Slow\]\s\[Conformance\]$'
test/e2e/framework/statefulset/rest.go:69
k8s.io/kubernetes/test/e2e/framework/statefulset.GetPodList({0x801de88, 0xc003920340}, 0xc0008b3400)
	test/e2e/framework/statefulset/rest.go:69 +0x153
k8s.io/kubernetes/test/e2e/framework/statefulset.WaitForRunning.func1()
	test/e2e/framework/statefulset/wait.go:37 +0x4a
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.ConditionFunc.WithContext.func1({0x2742871, 0x0})
	vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:222 +0x1b
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.runConditionWithCrashProtectionWithContext({0x7fe0bc8?, 0xc0000820c8?}, 0x262a61f?)
	vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:235 +0x57
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0000820c8}, 0xc004594048, 0x2fdb16a?)
	vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:662 +0x10c
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0000820c8}, 0xf8?, 0x2fd9d05?, 0x20?)
	vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 +0x9a
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0000820c8}, 0x65cbc00?, 0xc00424de48?, 0x262a967?)
	vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 +0x4a
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x2c3?, 0x0?, 0x0?)
	vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 +0x50
k8s.io/kubernetes/test/e2e/framework/statefulset.WaitForRunning({0x801de88?, 0xc003920340}, 0x1, 0x1, 0xc0008b3400)
	test/e2e/framework/statefulset/wait.go:35 +0xbd
k8s.io/kubernetes/test/e2e/framework/statefulset.WaitForRunningAndReady(...)
	test/e2e/framework/statefulset/wait.go:80
k8s.io/kubernetes/test/e2e/apps.glob..func10.2.11()
	test/e2e/apps/statefulset.go:708 +0x27b

There were additional failures detected after the initial failure:
[FAILED] Nov 26 18:52:54.994: Get "https://34.83.88.61/apis/apps/v1/namespaces/statefulset-7365/statefulsets": dial tcp 34.83.88.61:443: connect: connection refused
In [AfterEach] at: test/e2e/framework/statefulset/rest.go:76
----------
[FAILED] Nov 26 18:52:55.073: failed to list events in namespace "statefulset-7365": Get "https://34.83.88.61/api/v1/namespaces/statefulset-7365/events": dial tcp 34.83.88.61:443: connect: connection refused
In [DeferCleanup (Each)] at: test/e2e/framework/debug/dump.go:44
----------
[FAILED] Nov 26 18:52:55.117: Couldn't delete ns: "statefulset-7365": Delete "https://34.83.88.61/api/v1/namespaces/statefulset-7365": dial tcp 34.83.88.61:443: connect: connection refused (&url.Error{Op:"Delete", URL:"https://34.83.88.61/api/v1/namespaces/statefulset-7365", Err:(*net.OpError)(0xc004c9b630)})
In [DeferCleanup (Each)] at: test/e2e/framework/framework.go:370

from junit_01.xml
[BeforeEach] [sig-apps] StatefulSet set up framework | framework.go:178 STEP: Creating a kubernetes client 11/26/22 18:52:44.228 Nov 26 18:52:44.228: INFO: >>> kubeConfig: /workspace/.kube/config STEP: Building a namespace api object, basename statefulset 11/26/22 18:52:44.229 STEP: Waiting for a default service account to be provisioned in namespace 11/26/22 18:52:44.41 STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 11/26/22 18:52:44.508 [BeforeEach] [sig-apps] StatefulSet test/e2e/framework/metrics/init/init.go:31 [BeforeEach] [sig-apps] StatefulSet test/e2e/apps/statefulset.go:98 [BeforeEach] Basic StatefulSet functionality [StatefulSetBasic] test/e2e/apps/statefulset.go:113 STEP: Creating service test in namespace statefulset-7365 11/26/22 18:52:44.604 [It] Burst scaling should run to completion even with unhealthy pods [Slow] [Conformance] test/e2e/apps/statefulset.go:697 STEP: Creating stateful set ss in namespace statefulset-7365 11/26/22 18:52:44.687 STEP: Waiting until all stateful set ss replicas will be running in namespace statefulset-7365 11/26/22 18:52:44.762 Nov 26 18:52:44.871: INFO: Found 0 stateful pods, waiting for 1 Nov 26 18:52:54.912: INFO: Unexpected error: <*url.Error | 0xc0040bb3e0>: { Op: "Get", URL: "https://34.83.88.61/api/v1/namespaces/statefulset-7365/pods?labelSelector=baz%3Dblah%2Cfoo%3Dbar", Err: <*net.OpError | 0xc0038af770>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc0040bb3b0>{ IP: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 255, 255, 34, 83, 88, 61], Port: 443, Zone: "", }, Err: <*os.SyscallError | 0xc0045c0900>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, } Nov 26 18:52:54.913: FAIL: Get "https://34.83.88.61/api/v1/namespaces/statefulset-7365/pods?labelSelector=baz%3Dblah%2Cfoo%3Dbar": dial tcp 34.83.88.61:443: connect: connection refused Full Stack Trace k8s.io/kubernetes/test/e2e/framework/statefulset.GetPodList({0x801de88, 0xc003920340}, 0xc0008b3400) test/e2e/framework/statefulset/rest.go:69 +0x153 k8s.io/kubernetes/test/e2e/framework/statefulset.WaitForRunning.func1() test/e2e/framework/statefulset/wait.go:37 +0x4a k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.ConditionFunc.WithContext.func1({0x2742871, 0x0}) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:222 +0x1b k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.runConditionWithCrashProtectionWithContext({0x7fe0bc8?, 0xc0000820c8?}, 0x262a61f?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:235 +0x57 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0000820c8}, 0xc004594048, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:662 +0x10c k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0000820c8}, 0xf8?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 +0x9a k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0000820c8}, 0x65cbc00?, 0xc00424de48?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 +0x4a k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x2c3?, 0x0?, 0x0?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 +0x50 k8s.io/kubernetes/test/e2e/framework/statefulset.WaitForRunning({0x801de88?, 0xc003920340}, 0x1, 0x1, 0xc0008b3400) test/e2e/framework/statefulset/wait.go:35 +0xbd k8s.io/kubernetes/test/e2e/framework/statefulset.WaitForRunningAndReady(...) 
test/e2e/framework/statefulset/wait.go:80 k8s.io/kubernetes/test/e2e/apps.glob..func10.2.11() test/e2e/apps/statefulset.go:708 +0x27b E1126 18:52:54.913486 10316 runtime.go:79] Observed a panic: types.GinkgoError{Heading:"Your Test Panicked", Message:"When you, or your assertion library, calls Ginkgo's Fail(),\nGinkgo panics to prevent subsequent assertions from running.\n\nNormally Ginkgo rescues this panic so you shouldn't see it.\n\nHowever, if you make an assertion in a goroutine, Ginkgo can't capture the panic.\nTo circumvent this, you should call\n\n\tdefer GinkgoRecover()\n\nat the top of the goroutine that caused this panic.\n\nAlternatively, you may have made an assertion outside of a Ginkgo\nleaf node (e.g. in a container node or some out-of-band function) - please move your assertion to\nan appropriate Ginkgo node (e.g. a BeforeSuite, BeforeEach, It, etc...).", DocLink:"mental-model-how-ginkgo-handles-failure", CodeLocation:types.CodeLocation{FileName:"test/e2e/framework/statefulset/rest.go", LineNumber:69, FullStackTrace:"k8s.io/kubernetes/test/e2e/framework/statefulset.GetPodList({0x801de88, 0xc003920340}, 0xc0008b3400)\n\ttest/e2e/framework/statefulset/rest.go:69 +0x153\nk8s.io/kubernetes/test/e2e/framework/statefulset.WaitForRunning.func1()\n\ttest/e2e/framework/statefulset/wait.go:37 +0x4a\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.ConditionFunc.WithContext.func1({0x2742871, 0x0})\n\tvendor/k8s.io/apimachinery/pkg/util/wait/wait.go:222 +0x1b\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.runConditionWithCrashProtectionWithContext({0x7fe0bc8?, 0xc0000820c8?}, 0x262a61f?)\n\tvendor/k8s.io/apimachinery/pkg/util/wait/wait.go:235 +0x57\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0000820c8}, 0xc004594048, 0x2fdb16a?)\n\tvendor/k8s.io/apimachinery/pkg/util/wait/wait.go:662 +0x10c\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0000820c8}, 0xf8?, 0x2fd9d05?, 0x20?)\n\tvendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 +0x9a\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0000820c8}, 0x65cbc00?, 0xc00424de48?, 0x262a967?)\n\tvendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 +0x4a\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x2c3?, 0x0?, 0x0?)\n\tvendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 +0x50\nk8s.io/kubernetes/test/e2e/framework/statefulset.WaitForRunning({0x801de88?, 0xc003920340}, 0x1, 0x1, 0xc0008b3400)\n\ttest/e2e/framework/statefulset/wait.go:35 +0xbd\nk8s.io/kubernetes/test/e2e/framework/statefulset.WaitForRunningAndReady(...)\n\ttest/e2e/framework/statefulset/wait.go:80\nk8s.io/kubernetes/test/e2e/apps.glob..func10.2.11()\n\ttest/e2e/apps/statefulset.go:708 +0x27b", CustomMessage:""}} (�[1m�[38;5;9mYour Test Panicked�[0m �[38;5;243mtest/e2e/framework/statefulset/rest.go:69�[0m When you, or your assertion library, calls Ginkgo's Fail(), Ginkgo panics to prevent subsequent assertions from running. Normally Ginkgo rescues this panic so you shouldn't see it. However, if you make an assertion in a goroutine, Ginkgo can't capture the panic. To circumvent this, you should call defer GinkgoRecover() at the top of the goroutine that caused this panic. Alternatively, you may have made an assertion outside of a Ginkgo leaf node (e.g. in a container node or some out-of-band function) - please move your assertion to an appropriate Ginkgo node (e.g. 
a BeforeSuite, BeforeEach, It, etc...). �[1mLearn more at:�[0m �[38;5;14m�[4mhttp://onsi.github.io/ginkgo/#mental-model-how-ginkgo-handles-failure�[0m ) goroutine 4437 [running]: k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/runtime.logPanic({0x70eb7e0?, 0xc000304e70}) vendor/k8s.io/apimachinery/pkg/util/runtime/runtime.go:75 +0x99 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0xc000304e70?}) vendor/k8s.io/apimachinery/pkg/util/runtime/runtime.go:49 +0x75 panic({0x70eb7e0, 0xc000304e70}) /usr/local/go/src/runtime/panic.go:884 +0x212 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2.Fail({0xc004b7c300, 0xb2}, {0xc004dc35a8?, 0x75b521a?, 0xc004dc35c8?}) vendor/github.com/onsi/ginkgo/v2/core_dsl.go:352 +0x225 k8s.io/kubernetes/test/e2e/framework.Fail({0xc0000948c0, 0x9d}, {0xc004dc3640?, 0xc0000948c0?, 0xc004dc3668?}) test/e2e/framework/log.go:61 +0x145 k8s.io/kubernetes/test/e2e/framework.ExpectNoErrorWithOffset(0x1, {0x7fadf60, 0xc0040bb3e0}, {0x0?, 0xc003290460?, 0x10?}) test/e2e/framework/expect.go:76 +0x267 k8s.io/kubernetes/test/e2e/framework.ExpectNoError(...) test/e2e/framework/expect.go:43 k8s.io/kubernetes/test/e2e/framework/statefulset.GetPodList({0x801de88, 0xc003920340}, 0xc0008b3400) test/e2e/framework/statefulset/rest.go:69 +0x153 k8s.io/kubernetes/test/e2e/framework/statefulset.WaitForRunning.func1() test/e2e/framework/statefulset/wait.go:37 +0x4a k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.ConditionFunc.WithContext.func1({0x2742871, 0x0}) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:222 +0x1b k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.runConditionWithCrashProtectionWithContext({0x7fe0bc8?, 0xc0000820c8?}, 0x262a61f?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:235 +0x57 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0000820c8}, 0xc004594048, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:662 +0x10c k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0000820c8}, 0xf8?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 +0x9a k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0000820c8}, 0x65cbc00?, 0xc00424de48?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 +0x4a k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x2c3?, 0x0?, 0x0?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 +0x50 k8s.io/kubernetes/test/e2e/framework/statefulset.WaitForRunning({0x801de88?, 0xc003920340}, 0x1, 0x1, 0xc0008b3400) test/e2e/framework/statefulset/wait.go:35 +0xbd k8s.io/kubernetes/test/e2e/framework/statefulset.WaitForRunningAndReady(...) 
test/e2e/framework/statefulset/wait.go:80 k8s.io/kubernetes/test/e2e/apps.glob..func10.2.11() test/e2e/apps/statefulset.go:708 +0x27b k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0x2d591ce, 0xc002e1d500}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 +0x1b k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 +0x98 created by k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 +0xe3d [AfterEach] Basic StatefulSet functionality [StatefulSetBasic] test/e2e/apps/statefulset.go:124 Nov 26 18:52:54.953: INFO: Deleting all statefulset in ns statefulset-7365 Nov 26 18:52:54.993: INFO: Unexpected error: <*url.Error | 0xc004efa660>: { Op: "Get", URL: "https://34.83.88.61/apis/apps/v1/namespaces/statefulset-7365/statefulsets", Err: <*net.OpError | 0xc003f29450>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc00517f710>{ IP: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 255, 255, 34, 83, 88, 61], Port: 443, Zone: "", }, Err: <*os.SyscallError | 0xc002d7ce60>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, } Nov 26 18:52:54.994: FAIL: Get "https://34.83.88.61/apis/apps/v1/namespaces/statefulset-7365/statefulsets": dial tcp 34.83.88.61:443: connect: connection refused Full Stack Trace k8s.io/kubernetes/test/e2e/framework/statefulset.DeleteAllStatefulSets({0x801de88, 0xc003920340}, {0xc003290150, 0x10}) test/e2e/framework/statefulset/rest.go:76 +0x113 k8s.io/kubernetes/test/e2e/apps.glob..func10.2.2() test/e2e/apps/statefulset.go:129 +0x1b2 [AfterEach] [sig-apps] StatefulSet test/e2e/framework/node/init/init.go:32 Nov 26 18:52:54.994: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready [DeferCleanup (Each)] [sig-apps] StatefulSet test/e2e/framework/metrics/init/init.go:33 [DeferCleanup (Each)] [sig-apps] StatefulSet dump namespaces | framework.go:196 STEP: dump namespace information after failure 11/26/22 18:52:55.033 STEP: Collecting events from namespace "statefulset-7365". 
11/26/22 18:52:55.033 Nov 26 18:52:55.073: INFO: Unexpected error: failed to list events in namespace "statefulset-7365": <*url.Error | 0xc004efac00>: { Op: "Get", URL: "https://34.83.88.61/api/v1/namespaces/statefulset-7365/events", Err: <*net.OpError | 0xc003f29680>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc00517fc80>{ IP: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 255, 255, 34, 83, 88, 61], Port: 443, Zone: "", }, Err: <*os.SyscallError | 0xc002d7d1c0>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, } Nov 26 18:52:55.073: FAIL: failed to list events in namespace "statefulset-7365": Get "https://34.83.88.61/api/v1/namespaces/statefulset-7365/events": dial tcp 34.83.88.61:443: connect: connection refused Full Stack Trace k8s.io/kubernetes/test/e2e/framework/debug.dumpEventsInNamespace(0xc0029565c0, {0xc003290150, 0x10}) test/e2e/framework/debug/dump.go:44 +0x191 k8s.io/kubernetes/test/e2e/framework/debug.DumpAllNamespaceInfo({0x801de88, 0xc003920340}, {0xc003290150, 0x10}) test/e2e/framework/debug/dump.go:62 +0x8d k8s.io/kubernetes/test/e2e/framework/debug/init.init.0.func1.1(0xc002956650?, {0xc003290150?, 0x7fa7740?}) test/e2e/framework/debug/init/init.go:34 +0x32 k8s.io/kubernetes/test/e2e/framework.(*Framework).dumpNamespaceInfo.func1() test/e2e/framework/framework.go:274 +0x6d k8s.io/kubernetes/test/e2e/framework.(*Framework).dumpNamespaceInfo(0xc0011701e0) test/e2e/framework/framework.go:271 +0x179 reflect.Value.call({0x6627cc0?, 0xc0049405f0?, 0xc0040c8fb0?}, {0x75b6e72, 0x4}, {0xae73300, 0x0, 0xc0037ce708?}) /usr/local/go/src/reflect/value.go:584 +0x8c5 reflect.Value.Call({0x6627cc0?, 0xc0049405f0?, 0x29449fc?}, {0xae73300?, 0xc0040c8f80?, 0x0?}) /usr/local/go/src/reflect/value.go:368 +0xbc [DeferCleanup (Each)] [sig-apps] StatefulSet tear down framework | framework.go:193 STEP: Destroying namespace "statefulset-7365" for this suite. 11/26/22 18:52:55.074 Nov 26 18:52:55.117: FAIL: Couldn't delete ns: "statefulset-7365": Delete "https://34.83.88.61/api/v1/namespaces/statefulset-7365": dial tcp 34.83.88.61:443: connect: connection refused (&url.Error{Op:"Delete", URL:"https://34.83.88.61/api/v1/namespaces/statefulset-7365", Err:(*net.OpError)(0xc004c9b630)}) Full Stack Trace k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach.func1() test/e2e/framework/framework.go:370 +0x4fe k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc0011701e0) test/e2e/framework/framework.go:383 +0x1ca reflect.Value.call({0x6627cc0?, 0xc004940570?, 0xc004cf8fb0?}, {0x75b6e72, 0x4}, {0xae73300, 0x0, 0x0?}) /usr/local/go/src/reflect/value.go:584 +0x8c5 reflect.Value.Call({0x6627cc0?, 0xc004940570?, 0x0?}, {0xae73300?, 0x5?, 0xc00401f200?}) /usr/local/go/src/reflect/value.go:368 +0xbc
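The StatefulSet failure above comes out of the framework's WaitForRunning helper: wait.PollImmediate repeatedly calls statefulset.GetPodList, which lists the set's pods by label selector and wraps the result in ExpectNoError, so the very first list that hits the refused connection fails the test. A minimal sketch of that polling pattern follows; the interval, timeout, and function name are illustrative assumptions (the real helper lives in test/e2e/framework/statefulset), while the label selector baz=blah,foo=bar is the one in the failing GET above.

package sketch

import (
	"context"
	"fmt"
	"time"

	v1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/util/wait"
	"k8s.io/client-go/kubernetes"
)

// waitForRunningAndReady polls until `expected` pods matching the selector are
// Running and Ready, mirroring the WaitForRunning/WaitForRunningAndReady frames
// in the stack trace above (interval and timeout here are illustrative).
func waitForRunningAndReady(c kubernetes.Interface, ns string, expected int) error {
	return wait.PollImmediate(10*time.Second, 10*time.Minute, func() (bool, error) {
		pods, err := c.CoreV1().Pods(ns).List(context.TODO(),
			metav1.ListOptions{LabelSelector: "baz=blah,foo=bar"}) // selector seen in the failing GET
		if err != nil {
			// The e2e framework calls ExpectNoError at this point, which is what
			// turned the refused connection into an immediate failure; a plain
			// poll could instead return the error or keep retrying.
			return false, err
		}
		ready := 0
		for _, p := range pods.Items {
			if p.Status.Phase != v1.PodRunning {
				continue
			}
			for _, cond := range p.Status.Conditions {
				if cond.Type == v1.PodReady && cond.Status == v1.ConditionTrue {
					ready++
					break
				}
			}
		}
		fmt.Printf("found %d/%d running and ready pods\n", ready, expected)
		return ready == expected, nil
	})
}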
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-apps\]\sStatefulSet\sBasic\sStatefulSet\sfunctionality\s\[StatefulSetBasic\]\sScaling\sshould\shappen\sin\spredictable\sorder\sand\shalt\sif\sany\sstateful\spod\sis\sunhealthy\s\[Slow\]\s\[Conformance\]$'
test/e2e/framework/statefulset/rest.go:69
k8s.io/kubernetes/test/e2e/framework/statefulset.GetPodList({0x801de88, 0xc004cb5a00}, 0xc000aecf00)
	test/e2e/framework/statefulset/rest.go:69 +0x153
k8s.io/kubernetes/test/e2e/framework/statefulset.WaitForRunning.func1()
	test/e2e/framework/statefulset/wait.go:37 +0x4a
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.ConditionFunc.WithContext.func1({0x18, 0xc0000bf800})
	vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:222 +0x1b
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.runConditionWithCrashProtectionWithContext({0x7fe0bc8?, 0xc0001b0000?}, 0xc00186dd50?)
	vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:235 +0x57
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x90?, 0x2fd9d05?, 0x20?)
	vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:582 +0x38
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0x1?, 0xc00186dde0?, 0x262a967?)
	vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 +0x4a
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x801de88?, 0xc004cb5a00?, 0xc00186de20?)
	vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 +0x50
k8s.io/kubernetes/test/e2e/framework/statefulset.WaitForRunning({0x801de88?, 0xc004cb5a00}, 0x1, 0x0, 0xc000aecf00)
	test/e2e/framework/statefulset/wait.go:35 +0xbd
k8s.io/kubernetes/test/e2e/apps.waitForRunningAndNotReady(...)
	test/e2e/apps/wait.go:154
k8s.io/kubernetes/test/e2e/apps.glob..func10.2.10()
	test/e2e/apps/statefulset.go:636 +0x5d6

There were additional failures detected after the initial failure:
[FAILED] Nov 26 18:56:30.440: Get "https://34.83.88.61/apis/apps/v1/namespaces/statefulset-1262/statefulsets": dial tcp 34.83.88.61:443: connect: connection refused
In [AfterEach] at: test/e2e/framework/statefulset/rest.go:76
----------
[FAILED] Nov 26 18:56:30.520: failed to list events in namespace "statefulset-1262": Get "https://34.83.88.61/api/v1/namespaces/statefulset-1262/events": dial tcp 34.83.88.61:443: connect: connection refused
In [DeferCleanup (Each)] at: test/e2e/framework/debug/dump.go:44
----------
[FAILED] Nov 26 18:56:30.560: Couldn't delete ns: "statefulset-1262": Delete "https://34.83.88.61/api/v1/namespaces/statefulset-1262": dial tcp 34.83.88.61:443: connect: connection refused (&url.Error{Op:"Delete", URL:"https://34.83.88.61/api/v1/namespaces/statefulset-1262", Err:(*net.OpError)(0xc0034cc3c0)})
In [DeferCleanup (Each)] at: test/e2e/framework/framework.go:370

from junit_01.xml
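This third failure takes the same WaitForRunning/GetPodList path, but the full log below adds another pattern: the watcher created in "Initializing watcher for selector baz=blah,foo=bar" keeps logging "Watch failed" from retrywatcher.go roughly once per second while the apiserver is unreachable. That looks like client-go's RetryWatcher, which re-establishes the watch from the last known resourceVersion. The sketch below shows that pattern under stated assumptions; the function name and namespace are hypothetical, while the selector and the starting resourceVersion ("8660") are taken from the failing watch URLs in the log.

package sketch

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/watch"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	watchtools "k8s.io/client-go/tools/watch"
)

// watchStatefulSetPods resumes a pod watch from a known resourceVersion and keeps
// retrying on transient errors, which is why the log below repeats "Watch failed"
// for as long as connections to the apiserver are refused.
func watchStatefulSetPods(c kubernetes.Interface, ns, resourceVersion string) error {
	lw := &cache.ListWatch{
		WatchFunc: func(options metav1.ListOptions) (watch.Interface, error) {
			options.LabelSelector = "baz=blah,foo=bar" // selector from the test's watch URL
			return c.CoreV1().Pods(ns).Watch(context.TODO(), options)
		},
	}
	// e.g. resourceVersion "8660", as seen in the failing watch URLs.
	rw, err := watchtools.NewRetryWatcher(resourceVersion, lw)
	if err != nil {
		return err
	}
	defer rw.Stop()

	for event := range rw.ResultChan() {
		fmt.Printf("event: %s %T\n", event.Type, event.Object)
	}
	return nil
}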
[BeforeEach] [sig-apps] StatefulSet set up framework | framework.go:178 STEP: Creating a kubernetes client 11/26/22 18:49:43.563 Nov 26 18:49:43.563: INFO: >>> kubeConfig: /workspace/.kube/config STEP: Building a namespace api object, basename statefulset 11/26/22 18:49:43.564 Nov 26 18:49:43.604: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:45.644: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:47.643: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:49.643: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:51.644: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:53.643: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:55.644: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:57.644: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:59.644: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:50:01.644: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:50:03.644: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:50:05.644: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:50:07.644: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused STEP: Waiting for a default service account to be provisioned in namespace 11/26/22 18:51:17.569 STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 11/26/22 18:51:17.653 [BeforeEach] [sig-apps] StatefulSet test/e2e/framework/metrics/init/init.go:31 [BeforeEach] [sig-apps] StatefulSet test/e2e/apps/statefulset.go:98 [BeforeEach] Basic StatefulSet functionality [StatefulSetBasic] test/e2e/apps/statefulset.go:113 STEP: Creating service test in namespace statefulset-1262 11/26/22 18:51:19.855 [It] Scaling should happen in predictable order and halt if any stateful pod is unhealthy [Slow] [Conformance] test/e2e/apps/statefulset.go:587 STEP: Initializing watcher for selector baz=blah,foo=bar 11/26/22 18:51:19.899 STEP: Creating stateful set ss in namespace statefulset-1262 11/26/22 18:51:19.94 STEP: Waiting until all stateful set ss replicas will be running in namespace statefulset-1262 11/26/22 18:51:19.983 Nov 26 18:51:20.029: INFO: Waiting for pod ss-0 to enter Running - Ready=true, currently Pending - Ready=false Nov 26 18:51:30.091: 
INFO: Waiting for pod ss-0 to enter Running - Ready=true, currently Running - Ready=true STEP: Confirming that stateful set scale up will halt with unhealthy stateful pod 11/26/22 18:51:30.091 Nov 26 18:51:30.150: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true' Nov 26 18:51:31.081: INFO: rc: 1 Nov 26 18:51:31.081: INFO: Waiting 10s to retry failed RunHostCmd: error running /workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true: Command stdout: stderr: error: Internal error occurred: error executing command in container: failed to exec in container: container is in CONTAINER_EXITED state error: exit status 1 Nov 26 18:51:41.082: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true' Nov 26 18:51:41.782: INFO: rc: 1 Nov 26 18:51:41.782: INFO: Waiting 10s to retry failed RunHostCmd: error running /workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true: Command stdout: stderr: error: unable to upgrade connection: container not found ("webserver") error: exit status 1 Nov 26 18:51:51.782: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true' Nov 26 18:51:52.292: INFO: rc: 1 Nov 26 18:51:52.293: INFO: Waiting 10s to retry failed RunHostCmd: error running /workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true: Command stdout: stderr: Error from server: error dialing backend: No agent available error: exit status 1 Nov 26 18:52:02.293: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true' Nov 26 18:52:02.815: INFO: rc: 1 Nov 26 18:52:02.815: INFO: Waiting 10s to retry failed RunHostCmd: error running /workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true: Command stdout: stderr: Error from server: error dialing backend: No agent available error: exit status 1 Nov 26 18:52:12.815: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl 
--server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true' Nov 26 18:52:13.263: INFO: rc: 1 Nov 26 18:52:13.263: INFO: Waiting 10s to retry failed RunHostCmd: error running /workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true: Command stdout: stderr: Error from server: error dialing backend: No agent available error: exit status 1 Nov 26 18:52:23.264: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true' Nov 26 18:52:23.657: INFO: rc: 1 Nov 26 18:52:23.658: INFO: Waiting 10s to retry failed RunHostCmd: error running /workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true: Command stdout: stderr: Error from server: error dialing backend: No agent available error: exit status 1 Nov 26 18:52:33.659: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true' Nov 26 18:52:34.056: INFO: rc: 1 Nov 26 18:52:34.056: INFO: Waiting 10s to retry failed RunHostCmd: error running /workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true: Command stdout: stderr: Error from server: error dialing backend: No agent available error: exit status 1 Nov 26 18:52:44.056: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true' Nov 26 18:52:44.497: INFO: rc: 1 Nov 26 18:52:44.497: INFO: Waiting 10s to retry failed RunHostCmd: error running /workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true: Command stdout: stderr: error: unable to upgrade connection: container not found ("webserver") error: exit status 1 Nov 26 18:52:54.497: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true' Nov 26 18:52:54.661: INFO: rc: 1 Nov 26 18:52:54.661: INFO: Waiting 10s to retry failed RunHostCmd: error running /workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl 
--server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true: Command stdout: stderr: The connection to the server 34.83.88.61 was refused - did you specify the right host or port? error: exit status 1 Nov 26 18:53:04.662: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true' Nov 26 18:53:04.777: INFO: rc: 1 Nov 26 18:53:04.777: INFO: Waiting 10s to retry failed RunHostCmd: error running /workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true: Command stdout: stderr: The connection to the server 34.83.88.61 was refused - did you specify the right host or port? error: exit status 1 Nov 26 18:53:14.777: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true' Nov 26 18:53:14.890: INFO: rc: 1 Nov 26 18:53:14.890: INFO: Waiting 10s to retry failed RunHostCmd: error running /workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true: Command stdout: stderr: The connection to the server 34.83.88.61 was refused - did you specify the right host or port? 
error: exit status 1 E1126 18:53:18.289007 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" E1126 18:53:19.290152 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" E1126 18:53:20.291448 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" E1126 18:53:21.292231 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" E1126 18:53:22.293145 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" E1126 18:53:23.294088 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" E1126 18:53:24.294075 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" Nov 26 18:53:24.891: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true' Nov 26 18:53:25.034: INFO: rc: 1 Nov 26 18:53:25.034: INFO: Waiting 10s to retry failed RunHostCmd: error running /workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true: Command stdout: stderr: The connection to the server 34.83.88.61 was refused - did you specify the right host or port? 
error: exit status 1 E1126 18:53:25.295289 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" E1126 18:53:26.295101 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" E1126 18:53:27.295804 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" E1126 18:53:28.296480 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" E1126 18:53:29.296941 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" E1126 18:53:30.297907 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" E1126 18:53:31.298995 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" E1126 18:53:32.299600 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" E1126 18:53:33.300155 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" E1126 18:53:34.301104 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" Nov 26 18:53:35.035: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true' Nov 26 18:53:35.165: INFO: rc: 1 Nov 26 18:53:35.165: INFO: Waiting 10s to retry failed RunHostCmd: error running /workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v 
/usr/local/apache2/htdocs/index.html /tmp/ || true: Command stdout: stderr: The connection to the server 34.83.88.61 was refused - did you specify the right host or port? error: exit status 1 E1126 18:53:35.301786 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" E1126 18:53:36.302761 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" E1126 18:53:37.303717 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" E1126 18:53:38.304271 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" E1126 18:53:39.304613 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" E1126 18:53:40.305310 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" E1126 18:53:41.306482 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" E1126 18:53:42.307174 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" E1126 18:53:43.308065 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" E1126 18:53:44.308085 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused" Nov 26 18:53:45.165: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true' Nov 26 18:53:45.283: INFO: rc: 1 Nov 26 18:53:45.283: INFO: Waiting 10s to retry failed RunHostCmd: error running 
/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true:
Command stdout:
stderr:
The connection to the server 34.83.88.61 was refused - did you specify the right host or port?
error: exit status 1
E1126 18:53:45.308624 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused"
[The retrywatcher.go:130 "Watch failed ... connect: connection refused" error above repeats roughly once per second from 18:53:45 through 18:56:30; all later occurrences are omitted except the one variant at 18:54:49, kept below.]
Nov 26 18:53:55.284: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true'
Nov 26 18:53:55.412: INFO: rc: 1
Nov 26 18:53:55.412: INFO: Waiting 10s to retry failed RunHostCmd: error running /workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true:
Command stdout:
stderr:
The connection to the server 34.83.88.61 was refused - did you specify the right host or port?
error: exit status 1
Nov 26 18:54:45.920: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true'
E1126 18:54:49.040728 10178 retrywatcher.go:130] "Watch failed" err="Get \"https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?allowWatchBookmarks=true&labelSelector=baz%3Dblah%2Cfoo%3Dbar&resourceVersion=8660&watch=true\": dial tcp 34.83.88.61:443: connect: connection refused - error from a previous attempt: read tcp 10.60.55.171:41394->34.83.88.61:443: read: connection reset by peer"
Nov 26 18:54:49.046: INFO: rc: 1
Nov 26 18:54:49.046: INFO: Waiting 10s to retry failed RunHostCmd: error running /workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true:
Command stdout:
stderr:
error: Get "https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods/ss-0": dial tcp 34.83.88.61:443: connect: connection refused - error from a previous attempt: read tcp 10.60.55.171:41418->34.83.88.61:443: read: connection reset by peer
error: exit status 1
[All other Running / rc: 1 / "Waiting 10s to retry failed RunHostCmd" cycles — at 18:54:05, 18:54:15, 18:54:25, 18:54:35, 18:54:59, 18:55:09, 18:55:19, 18:55:29, 18:55:39, 18:55:49, 18:55:59 and 18:56:09 — ran the same command, got the same "The connection to the server 34.83.88.61 was refused" stderr and exit status 1, and are omitted here.]
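[Context, not part of the captured log: the "Waiting 10s to retry failed RunHostCmd" cycle above is a fixed-interval retry loop around the kubectl exec. The interval is the 10s printed in the messages, and the overall budget appears to be about five minutes, consistent with the time.Sleep(0x2540be400) (10e9 ns = 10s) and the 0x45d964b800 (300e9 ns = 5m) argument in the progress report that follows. A minimal Go sketch of that pattern, with illustrative names rather than the framework's actual RunHostCmdWithRetries signature:]

    // Package retrysketch is an illustrative stand-in, not the e2e framework's code.
    package retrysketch

    import (
        "fmt"
        "time"
    )

    // runHostCmdWithRetries re-runs cmd every interval until it succeeds or the
    // overall timeout is exhausted, logging each failed attempt much like the
    // "Waiting 10s to retry failed RunHostCmd" lines above.
    func runHostCmdWithRetries(cmd func() (string, error), interval, timeout time.Duration) (string, error) {
        deadline := time.Now().Add(timeout)
        for {
            out, err := cmd()
            if err == nil {
                return out, nil
            }
            if time.Now().After(deadline) {
                return out, fmt.Errorf("giving up after %v: %w", timeout, err)
            }
            fmt.Printf("Waiting %v to retry failed RunHostCmd: %v\n", interval, err)
            time.Sleep(interval)
        }
    }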
------------------------------
Progress Report for Ginkgo Process #9
Automatically polling progress:
  [sig-apps] StatefulSet Basic StatefulSet functionality [StatefulSetBasic] Scaling should happen in predictable order and halt if any stateful pod is unhealthy [Slow] [Conformance] (Spec Runtime: 6m36.338s)
    test/e2e/apps/statefulset.go:587
    In [It] (Node Runtime: 5m0.001s)
      test/e2e/apps/statefulset.go:587
      At [By Step] Confirming that stateful set scale up will halt with unhealthy stateful pod (Step Runtime: 4m49.809s)
        test/e2e/apps/statefulset.go:634

  Spec Goroutine
  goroutine 2547 [sleep]
    time.Sleep(0x2540be400)
      /usr/local/go/src/runtime/time.go:195
    k8s.io/kubernetes/test/e2e/framework/pod/output.RunHostCmdWithRetries({0xc004f0a2f0, 0x10}, {0xc004f0a2cc, 0x4}, {0xc004ce8500, 0x38}, 0xc003484cd0?, 0x45d964b800)
      test/e2e/framework/pod/output/output.go:113
    k8s.io/kubernetes/test/e2e/framework/statefulset.ExecInStatefulPods({0x801de88?, 0xc004cb5a00?}, 0xc00186de20?, {0xc004ce8500, 0x38})
      test/e2e/framework/statefulset/rest.go:240
    > k8s.io/kubernetes/test/e2e/apps.breakHTTPProbe({0x801de88, 0xc004cb5a00}, 0x0?)
      test/e2e/apps/statefulset.go:1704
    > k8s.io/kubernetes/test/e2e/apps.glob..func10.2.10()
      test/e2e/apps/statefulset.go:635
    k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0x2d591ce, 0xc0009d1380})
      vendor/github.com/onsi/ginkgo/v2/internal/node.go:449
    k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2()
      vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750
    k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode
      vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738

  Goroutines of Interest
  goroutine 2549 [select, 4 minutes]
    k8s.io/kubernetes/vendor/k8s.io/client-go/tools/watch.UntilWithoutRetry({0x7fe0c00, 0xc004b3b140}, {0x7fbcaa0, 0xc000e58100}, {0xc0040b5f38, 0x1, 0x2?})
      vendor/k8s.io/client-go/tools/watch/until.go:73
    k8s.io/kubernetes/vendor/k8s.io/client-go/tools/watch.Until({0x7fe0c00, 0xc004b3b140}, {0xc0022f1a98?, 0x75b5158?}, {0x7facee0?, 0xc001239f38?}, {0xc0040b5f38, 0x1, 0x1})
      vendor/k8s.io/client-go/tools/watch/until.go:114
    > k8s.io/kubernetes/test/e2e/apps.glob..func10.2.10.2()
      test/e2e/apps/statefulset.go:613
    > k8s.io/kubernetes/test/e2e/apps.glob..func10.2.10
      test/e2e/apps/statefulset.go:605
------------------------------
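[Context, not part of the captured log: goroutine 2549 above and the repeated retrywatcher.go:130 "Watch failed" errors come from client-go's watch helpers — watch.Until appears to wrap the supplied watcher in a RetryWatcher, which re-establishes the watch (and logs each failure) every time the connection to the apiserver drops. A rough, hedged sketch of that usage; the label selector and resource version are taken from the failing watch URL, everything else is illustrative and not the test's actual code:]

    package watchsketch

    import (
        "context"

        v1 "k8s.io/api/core/v1"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/apimachinery/pkg/watch"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/cache"
        watchtools "k8s.io/client-go/tools/watch"
    )

    // waitForUnhealthyPod watches pods in ns until one reports not-ready.
    // watchtools.Until retries the watch internally, which is what produces the
    // retrywatcher.go "Watch failed" lines while the apiserver is unreachable.
    func waitForUnhealthyPod(ctx context.Context, client kubernetes.Interface, ns string) error {
        lw := &cache.ListWatch{
            WatchFunc: func(options metav1.ListOptions) (watch.Interface, error) {
                options.LabelSelector = "baz=blah,foo=bar" // selector seen in the failing watch URL
                return client.CoreV1().Pods(ns).Watch(ctx, options)
            },
        }
        // "8660" is the resourceVersion from the failing watch URL above.
        _, err := watchtools.Until(ctx, "8660", lw, func(event watch.Event) (bool, error) {
            pod, ok := event.Object.(*v1.Pod)
            if !ok {
                return false, nil
            }
            // Condition body is illustrative: stop once the pod reports not-ready.
            for _, cond := range pod.Status.Conditions {
                if cond.Type == v1.PodReady && cond.Status == v1.ConditionFalse {
                    return true, nil
                }
            }
            return false, nil
        })
        return err
    }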
Nov 26 18:56:20.064: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true'
Nov 26 18:56:20.190: INFO: rc: 1
Nov 26 18:56:20.190: INFO: Waiting 10s to retry failed RunHostCmd: error running /workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true:
Command stdout:
stderr:
The connection to the server 34.83.88.61 was refused - did you specify the right host or port?
error: exit status 1
Nov 26 18:56:30.191: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=statefulset-1262 exec ss-0 -- /bin/sh -x -c mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true'
Nov 26 18:56:30.320: INFO: rc: 1
Nov 26 18:56:30.320: INFO: stdout of mv -v /usr/local/apache2/htdocs/index.html /tmp/ || true on ss-0:
Nov 26 18:56:30.360: INFO: Unexpected error:
    <*url.Error | 0xc003434f30>: {
        Op: "Get",
        URL: "https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?labelSelector=baz%3Dblah%2Cfoo%3Dbar",
        Err: <*net.OpError | 0xc00205bb30>{
            Op: "dial",
            Net: "tcp",
            Source: nil,
            Addr: <*net.TCPAddr | 0xc0028fa570>{
                IP: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 255, 255, 34, 83, 88, 61],
                Port: 443,
                Zone: "",
            },
            Err: <*os.SyscallError | 0xc0040f3840>{
                Syscall: "connect",
                Err: <syscall.Errno>0x6f,
            },
        },
    }
Nov 26 18:56:30.360: FAIL: Get "https://34.83.88.61/api/v1/namespaces/statefulset-1262/pods?labelSelector=baz%3Dblah%2Cfoo%3Dbar": dial tcp 34.83.88.61:443: connect: connection refused

Full Stack Trace
k8s.io/kubernetes/test/e2e/framework/statefulset.GetPodList({0x801de88, 0xc004cb5a00}, 0xc000aecf00)
	test/e2e/framework/statefulset/rest.go:69 +0x153
k8s.io/kubernetes/test/e2e/framework/statefulset.WaitForRunning.func1()
	test/e2e/framework/statefulset/wait.go:37 +0x4a
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.ConditionFunc.WithContext.func1({0x18, 0xc0000bf800})
	vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:222 +0x1b
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.runConditionWithCrashProtectionWithContext({0x7fe0bc8?, 0xc0001b0000?}, 0xc00186dd50?)
	vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:235 +0x57
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x90?, 0x2fd9d05?, 0x20?)
	vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:582 +0x38
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0x1?, 0xc00186dde0?, 0x262a967?)
	vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 +0x4a
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x801de88?, 0xc004cb5a00?, 0xc00186de20?)
	vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 +0x50
k8s.io/kubernetes/test/e2e/framework/statefulset.WaitForRunning({0x801de88?, 0xc004cb5a00}, 0x1, 0x0, 0xc000aecf00)
	test/e2e/framework/statefulset/wait.go:35 +0xbd
k8s.io/kubernetes/test/e2e/apps.waitForRunningAndNotReady(...)
	test/e2e/apps/wait.go:154
k8s.io/kubernetes/test/e2e/apps.glob..func10.2.10()
	test/e2e/apps/statefulset.go:636 +0x5d6
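[Context, not part of the captured log: the stack above shows the shape of the wait — WaitForRunning polls the pod list via wait.PollImmediate, and the list call (GetPodList at rest.go:69) appears to fail the test immediately on any error, which is why the "connection refused" surfaces as a FAIL inside the poll loop rather than another retry. A hedged sketch of that polling pattern; durations, names and the error handling are illustrative, not the framework's actual WaitForRunning:]

    package pollsketch

    import (
        "context"
        "time"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/apimachinery/pkg/util/wait"
        "k8s.io/client-go/kubernetes"
    )

    // waitForPodCount polls until at least `want` pods matching selector exist.
    func waitForPodCount(ctx context.Context, client kubernetes.Interface, ns, selector string, want int) error {
        return wait.PollImmediate(10*time.Second, 5*time.Minute, func() (bool, error) {
            pods, err := client.CoreV1().Pods(ns).List(ctx, metav1.ListOptions{LabelSelector: selector})
            if err != nil {
                // Returning a non-nil error aborts the poll at once; returning
                // (false, nil) here would keep polling through transient
                // "connection refused" errors instead.
                return false, err
            }
            return len(pods.Items) >= want, nil
        })
    }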
a BeforeSuite, BeforeEach, It, etc...).", DocLink:"mental-model-how-ginkgo-handles-failure", CodeLocation:types.CodeLocation{FileName:"test/e2e/framework/statefulset/rest.go", LineNumber:69, FullStackTrace:"k8s.io/kubernetes/test/e2e/framework/statefulset.GetPodList({0x801de88, 0xc004cb5a00}, 0xc000aecf00)\n\ttest/e2e/framework/statefulset/rest.go:69 +0x153\nk8s.io/kubernetes/test/e2e/framework/statefulset.WaitForRunning.func1()\n\ttest/e2e/framework/statefulset/wait.go:37 +0x4a\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.ConditionFunc.WithContext.func1({0x18, 0xc0000bf800})\n\tvendor/k8s.io/apimachinery/pkg/util/wait/wait.go:222 +0x1b\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.runConditionWithCrashProtectionWithContext({0x7fe0bc8?, 0xc0001b0000?}, 0xc00186dd50?)\n\tvendor/k8s.io/apimachinery/pkg/util/wait/wait.go:235 +0x57\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x90?, 0x2fd9d05?, 0x20?)\n\tvendor/k8s.io/apimachinery/pkg/util/wait/wait.go:582 +0x38\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0x1?, 0xc00186dde0?, 0x262a967?)\n\tvendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 +0x4a\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x801de88?, 0xc004cb5a00?, 0xc00186de20?)\n\tvendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 +0x50\nk8s.io/kubernetes/test/e2e/framework/statefulset.WaitForRunning({0x801de88?, 0xc004cb5a00}, 0x1, 0x0, 0xc000aecf00)\n\ttest/e2e/framework/statefulset/wait.go:35 +0xbd\nk8s.io/kubernetes/test/e2e/apps.waitForRunningAndNotReady(...)\n\ttest/e2e/apps/wait.go:154\nk8s.io/kubernetes/test/e2e/apps.glob..func10.2.10()\n\ttest/e2e/apps/statefulset.go:636 +0x5d6", CustomMessage:""}} (�[1m�[38;5;9mYour Test Panicked�[0m �[38;5;243mtest/e2e/framework/statefulset/rest.go:69�[0m When you, or your assertion library, calls Ginkgo's Fail(), Ginkgo panics to prevent subsequent assertions from running. Normally Ginkgo rescues this panic so you shouldn't see it. However, if you make an assertion in a goroutine, Ginkgo can't capture the panic. To circumvent this, you should call defer GinkgoRecover() at the top of the goroutine that caused this panic. Alternatively, you may have made an assertion outside of a Ginkgo leaf node (e.g. in a container node or some out-of-band function) - please move your assertion to an appropriate Ginkgo node (e.g. a BeforeSuite, BeforeEach, It, etc...). 
Learn more at: http://onsi.github.io/ginkgo/#mental-model-how-ginkgo-handles-failure ) goroutine 2547 [running]: k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/runtime.logPanic({0x70eb7e0?, 0xc0001d8070}) vendor/k8s.io/apimachinery/pkg/util/runtime/runtime.go:75 +0x99 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0x26417e7?}) vendor/k8s.io/apimachinery/pkg/util/runtime/runtime.go:49 +0x75 panic({0x70eb7e0, 0xc0001d8070}) /usr/local/go/src/runtime/panic.go:884 +0x212 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2.Fail({0xc00409c840, 0xb2}, {0xc000f775d8?, 0x75b521a?, 0xc000f775f8?}) vendor/github.com/onsi/ginkgo/v2/core_dsl.go:352 +0x225 k8s.io/kubernetes/test/e2e/framework.Fail({0xc000095900, 0x9d}, {0xc000f77670?, 0xc000095900?, 0xc000f77698?}) test/e2e/framework/log.go:61 +0x145 k8s.io/kubernetes/test/e2e/framework.ExpectNoErrorWithOffset(0x1, {0x7fadf60, 0xc003434f30}, {0x0?, 0xc002595380?, 0x10?}) test/e2e/framework/expect.go:76 +0x267 k8s.io/kubernetes/test/e2e/framework.ExpectNoError(...) test/e2e/framework/expect.go:43 k8s.io/kubernetes/test/e2e/framework/statefulset.GetPodList({0x801de88, 0xc004cb5a00}, 0xc000aecf00) test/e2e/framework/statefulset/rest.go:69 +0x153 k8s.io/kubernetes/test/e2e/framework/statefulset.WaitForRunning.func1() test/e2e/framework/statefulset/wait.go:37 +0x4a k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.ConditionFunc.WithContext.func1({0x18, 0xc0000bf800}) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:222 +0x1b k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.runConditionWithCrashProtectionWithContext({0x7fe0bc8?, 0xc0001b0000?}, 0xc00186dd50?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:235 +0x57 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x90?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:582 +0x38 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0x1?, 0xc00186dde0?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 +0x4a k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x801de88?, 0xc004cb5a00?, 0xc00186de20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 +0x50 k8s.io/kubernetes/test/e2e/framework/statefulset.WaitForRunning({0x801de88?, 0xc004cb5a00}, 0x1, 0x0, 0xc000aecf00) test/e2e/framework/statefulset/wait.go:35 +0xbd k8s.io/kubernetes/test/e2e/apps.waitForRunningAndNotReady(...)
test/e2e/apps/wait.go:154 k8s.io/kubernetes/test/e2e/apps.glob..func10.2.10() test/e2e/apps/statefulset.go:636 +0x5d6 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0x2d591ce, 0xc0009d1380}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 +0x1b k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 +0x98 created by k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 +0xe3d [AfterEach] Basic StatefulSet functionality [StatefulSetBasic] test/e2e/apps/statefulset.go:124 Nov 26 18:56:30.400: INFO: Deleting all statefulset in ns statefulset-1262 Nov 26 18:56:30.440: INFO: Unexpected error: <*url.Error | 0xc003435500>: { Op: "Get", URL: "https://34.83.88.61/apis/apps/v1/namespaces/statefulset-1262/statefulsets", Err: <*net.OpError | 0xc00205bf40>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc0025cc450>{ IP: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 255, 255, 34, 83, 88, 61], Port: 443, Zone: "", }, Err: <*os.SyscallError | 0xc0040f3c80>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, } Nov 26 18:56:30.440: FAIL: Get "https://34.83.88.61/apis/apps/v1/namespaces/statefulset-1262/statefulsets": dial tcp 34.83.88.61:443: connect: connection refused Full Stack Trace k8s.io/kubernetes/test/e2e/framework/statefulset.DeleteAllStatefulSets({0x801de88, 0xc004cb5a00}, {0xc0022f0c80, 0x10}) test/e2e/framework/statefulset/rest.go:76 +0x113 k8s.io/kubernetes/test/e2e/apps.glob..func10.2.2() test/e2e/apps/statefulset.go:129 +0x1b2 [AfterEach] [sig-apps] StatefulSet test/e2e/framework/node/init/init.go:32 Nov 26 18:56:30.440: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready [DeferCleanup (Each)] [sig-apps] StatefulSet test/e2e/framework/metrics/init/init.go:33 [DeferCleanup (Each)] [sig-apps] StatefulSet dump namespaces | framework.go:196 STEP: dump namespace information after failure 11/26/22 18:56:30.48 STEP: Collecting events from namespace "statefulset-1262". 
11/26/22 18:56:30.48 Nov 26 18:56:30.520: INFO: Unexpected error: failed to list events in namespace "statefulset-1262": <*url.Error | 0xc003435a40>: { Op: "Get", URL: "https://34.83.88.61/api/v1/namespaces/statefulset-1262/events", Err: <*net.OpError | 0xc0040a82d0>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc0028faae0>{ IP: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 255, 255, 34, 83, 88, 61], Port: 443, Zone: "", }, Err: <*os.SyscallError | 0xc0040f3fe0>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, } Nov 26 18:56:30.520: FAIL: failed to list events in namespace "statefulset-1262": Get "https://34.83.88.61/api/v1/namespaces/statefulset-1262/events": dial tcp 34.83.88.61:443: connect: connection refused Full Stack Trace k8s.io/kubernetes/test/e2e/framework/debug.dumpEventsInNamespace(0xc000f765c0, {0xc0022f0c80, 0x10}) test/e2e/framework/debug/dump.go:44 +0x191 k8s.io/kubernetes/test/e2e/framework/debug.DumpAllNamespaceInfo({0x801de88, 0xc004cb5a00}, {0xc0022f0c80, 0x10}) test/e2e/framework/debug/dump.go:62 +0x8d k8s.io/kubernetes/test/e2e/framework/debug/init.init.0.func1.1(0xc000f76650?, {0xc0022f0c80?, 0x7fa7740?}) test/e2e/framework/debug/init/init.go:34 +0x32 k8s.io/kubernetes/test/e2e/framework.(*Framework).dumpNamespaceInfo.func1() test/e2e/framework/framework.go:274 +0x6d k8s.io/kubernetes/test/e2e/framework.(*Framework).dumpNamespaceInfo(0xc0010c21e0) test/e2e/framework/framework.go:271 +0x179 reflect.Value.call({0x6627cc0?, 0xc00119a460?, 0xc00424cfb0?}, {0x75b6e72, 0x4}, {0xae73300, 0x0, 0xc00056e708?}) /usr/local/go/src/reflect/value.go:584 +0x8c5 reflect.Value.Call({0x6627cc0?, 0xc00119a460?, 0x29449fc?}, {0xae73300?, 0xc00424cf80?, 0x26225bd?}) /usr/local/go/src/reflect/value.go:368 +0xbc [DeferCleanup (Each)] [sig-apps] StatefulSet tear down framework | framework.go:193 STEP: Destroying namespace "statefulset-1262" for this suite. 11/26/22 18:56:30.521 Nov 26 18:56:30.560: FAIL: Couldn't delete ns: "statefulset-1262": Delete "https://34.83.88.61/api/v1/namespaces/statefulset-1262": dial tcp 34.83.88.61:443: connect: connection refused (&url.Error{Op:"Delete", URL:"https://34.83.88.61/api/v1/namespaces/statefulset-1262", Err:(*net.OpError)(0xc0034cc3c0)}) Full Stack Trace k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach.func1() test/e2e/framework/framework.go:370 +0x4fe k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc0010c21e0) test/e2e/framework/framework.go:383 +0x1ca reflect.Value.call({0x6627cc0?, 0xc00119a390?, 0xc0034a5fb0?}, {0x75b6e72, 0x4}, {0xae73300, 0x0, 0x0?}) /usr/local/go/src/reflect/value.go:584 +0x8c5 reflect.Value.Call({0x6627cc0?, 0xc00119a390?, 0x0?}, {0xae73300?, 0x5?, 0xc000f47d40?}) /usr/local/go/src/reflect/value.go:368 +0xbc
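Note: the "Your Test Panicked" block above is Ginkgo's standard guidance for Fail() panics it cannot rescue directly; here the panic was first observed by apimachinery's crash handler inside the wait helper, as the stack trace shows. For reference, a minimal Ginkgo v2 sketch of the defer GinkgoRecover() pattern that the message recommends for assertions made in user-spawned goroutines (illustrative only, not the e2e framework's actual code):

package example

import (
	"github.com/onsi/ginkgo/v2"
)

var _ = ginkgo.It("makes an assertion from a spawned goroutine", func() {
	done := make(chan struct{})
	go func() {
		// Without GinkgoRecover, a Fail() in this goroutine panics outside
		// Ginkgo's control, which is exactly the situation the
		// "Your Test Panicked" message above describes.
		defer ginkgo.GinkgoRecover()
		defer close(done)
		ginkgo.Fail("deliberate failure raised off the main test goroutine")
	}()
	<-done
})

With the deferred GinkgoRecover in place, the spec is simply marked failed instead of the whole process panicking.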
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-auth\]\sServiceAccounts\sshould\ssupport\sInClusterConfig\swith\stoken\srotation\s\[Slow\]$'
test/e2e/framework/framework.go:241 k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc0011be3c0) test/e2e/framework/framework.go:241 +0x96f (from junit_01.xml)
[BeforeEach] [sig-auth] ServiceAccounts set up framework | framework.go:178 STEP: Creating a kubernetes client 11/26/22 18:49:13.183 Nov 26 18:49:13.183: INFO: >>> kubeConfig: /workspace/.kube/config STEP: Building a namespace api object, basename svcaccounts 11/26/22 18:49:13.185 Nov 26 18:49:13.225: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:15.265: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:17.265: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:19.265: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:21.265: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:23.265: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:25.265: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:27.265: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:29.265: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:31.265: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:33.264: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:35.265: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:37.265: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:39.265: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:41.265: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:43.266: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:43.305: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:43.305: INFO: Unexpected error: <*errors.errorString | 0xc000113cb0>: { s: "timed out waiting for the condition", } Nov 26 18:49:43.305: FAIL: timed out waiting for the condition Full Stack Trace k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc0011be3c0) test/e2e/framework/framework.go:241 +0x96f [AfterEach] [sig-auth] ServiceAccounts 
test/e2e/framework/node/init/init.go:32 Nov 26 18:49:43.305: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready [DeferCleanup (Each)] [sig-auth] ServiceAccounts dump namespaces | framework.go:196 STEP: dump namespace information after failure 11/26/22 18:49:43.345 [DeferCleanup (Each)] [sig-auth] ServiceAccounts tear down framework | framework.go:193
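Note: the repeating "Unexpected error while creating namespace" lines above are the framework's BeforeEach retrying namespace creation roughly every two seconds until its timeout elapses, after which it fails with "timed out waiting for the condition". A rough sketch of that kind of retry loop using client-go and the apimachinery wait package (the function name and the 2s/30s constants are illustrative, not the framework's actual implementation):

package example

import (
	"context"
	"time"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/util/wait"
	"k8s.io/client-go/kubernetes"
)

// createNamespaceWithRetry keeps retrying namespace creation until it succeeds
// or the timeout elapses. Transient errors (such as the connection-refused
// errors in the log above) are swallowed and retried, which is why the same
// Post error repeats until the final "timed out waiting for the condition".
func createNamespaceWithRetry(c kubernetes.Interface, baseName string) (*corev1.Namespace, error) {
	var created *corev1.Namespace
	err := wait.PollImmediate(2*time.Second, 30*time.Second, func() (bool, error) {
		ns := &corev1.Namespace{
			ObjectMeta: metav1.ObjectMeta{GenerateName: baseName + "-"},
		}
		got, err := c.CoreV1().Namespaces().Create(context.TODO(), ns, metav1.CreateOptions{})
		if err != nil {
			// API server unreachable or otherwise failing: retry on the next tick.
			return false, nil
		}
		created = got
		return true, nil
	})
	return created, err
}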
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-cli\]\sKubectl\sclient\sSimple\spod\sshould\sreturn\scommand\sexit\scodes\s\[Slow\]\srunning\sa\sfailing\scommand\swith\s\-\-leave\-stdin\-open$'
test/e2e/framework/framework.go:241 k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc00140ed20) test/e2e/framework/framework.go:241 +0x96f (from junit_01.xml)
[BeforeEach] [sig-cli] Kubectl client set up framework | framework.go:178 STEP: Creating a kubernetes client 11/26/22 18:53:22.095 Nov 26 18:53:22.095: INFO: >>> kubeConfig: /workspace/.kube/config STEP: Building a namespace api object, basename kubectl 11/26/22 18:53:22.097 Nov 26 18:53:22.136: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:24.176: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:26.176: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:28.176: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:30.176: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:32.176: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:34.176: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:36.177: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:38.176: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:40.177: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:42.176: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:44.177: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:46.177: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:48.176: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:50.176: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:52.177: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:52.216: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:52.216: INFO: Unexpected error: <*errors.errorString | 0xc00017da30>: { s: "timed out waiting for the condition", } Nov 26 18:53:52.216: FAIL: timed out waiting for the condition Full Stack Trace k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc00140ed20) test/e2e/framework/framework.go:241 +0x96f [AfterEach] [sig-cli] Kubectl client 
test/e2e/framework/node/init/init.go:32 Nov 26 18:53:52.217: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready [DeferCleanup (Each)] [sig-cli] Kubectl client dump namespaces | framework.go:196 STEP: dump namespace information after failure 11/26/22 18:53:52.256 [DeferCleanup (Each)] [sig-cli] Kubectl client tear down framework | framework.go:193
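Note: every failure in this run bottoms out in the same dial error: a *url.Error wrapping a *net.OpError whose syscall.Errno is 0x6f (111, ECONNREFUSED on Linux), meaning nothing is accepting connections on 34.83.88.61:443 rather than the apiserver returning an HTTP error. A small illustrative sketch of detecting that condition with the Go standard library (not code from the test framework):

package example

import (
	"errors"
	"net/http"
	"syscall"
)

// isConnectionRefused unwraps the error chain produced by net/http
// (*url.Error -> *net.OpError -> *os.SyscallError -> syscall.Errno)
// and reports whether the root cause is ECONNREFUSED.
func isConnectionRefused(err error) bool {
	return errors.Is(err, syscall.ECONNREFUSED)
}

// probe returns true when the endpoint answers at all; a refused connection
// is reported as "not reachable yet" so the caller can keep retrying.
func probe(url string) (bool, error) {
	resp, err := http.Get(url)
	if err != nil {
		if isConnectionRefused(err) {
			return false, nil
		}
		return false, err
	}
	resp.Body.Close()
	return true, nil
}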
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-cli\]\sKubectl\sclient\sSimple\spod\sshould\sreturn\scommand\sexit\scodes\s\[Slow\]\srunning\sa\sfailing\scommand\swithout\s\-\-restart\=Never$'
test/e2e/kubectl/kubectl.go:567 k8s.io/kubernetes/test/e2e/kubectl.glob..func1.8.7.5() test/e2e/kubectl/kubectl.go:567 +0x31e (from junit_01.xml)
[BeforeEach] [sig-cli] Kubectl client set up framework | framework.go:178 STEP: Creating a kubernetes client 11/26/22 18:51:40.228 Nov 26 18:51:40.228: INFO: >>> kubeConfig: /workspace/.kube/config STEP: Building a namespace api object, basename kubectl 11/26/22 18:51:40.23 STEP: Waiting for a default service account to be provisioned in namespace 11/26/22 18:51:40.537 STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 11/26/22 18:51:40.643 [BeforeEach] [sig-cli] Kubectl client test/e2e/framework/metrics/init/init.go:31 [BeforeEach] [sig-cli] Kubectl client test/e2e/kubectl/kubectl.go:274 [BeforeEach] Simple pod test/e2e/kubectl/kubectl.go:411 STEP: creating the pod from 11/26/22 18:51:40.752 Nov 26 18:51:40.753: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=kubectl-8079 create -f -' Nov 26 18:51:41.505: INFO: stderr: "" Nov 26 18:51:41.505: INFO: stdout: "pod/httpd created\n" Nov 26 18:51:41.505: INFO: Waiting up to 5m0s for 1 pods to be running and ready: [httpd] Nov 26 18:51:41.505: INFO: Waiting up to 5m0s for pod "httpd" in namespace "kubectl-8079" to be "running and ready" Nov 26 18:51:41.578: INFO: Pod "httpd": Phase="Pending", Reason="", readiness=false. Elapsed: 72.863127ms Nov 26 18:51:41.578: INFO: Error evaluating pod condition running and ready: want pod 'httpd' on 'bootstrap-e2e-minion-group-gnb8' to be 'Running' but was 'Pending' Nov 26 18:51:43.711: INFO: Pod "httpd": Phase="Pending", Reason="", readiness=false. Elapsed: 2.206155052s Nov 26 18:51:43.711: INFO: Error evaluating pod condition running and ready: want pod 'httpd' on 'bootstrap-e2e-minion-group-gnb8' to be 'Running' but was 'Pending' Nov 26 18:51:45.721: INFO: Pod "httpd": Phase="Pending", Reason="", readiness=false. Elapsed: 4.215916398s Nov 26 18:51:45.721: INFO: Error evaluating pod condition running and ready: want pod 'httpd' on 'bootstrap-e2e-minion-group-gnb8' to be 'Running' but was 'Pending' Nov 26 18:51:47.662: INFO: Pod "httpd": Phase="Pending", Reason="", readiness=false. Elapsed: 6.156978877s Nov 26 18:51:47.662: INFO: Error evaluating pod condition running and ready: want pod 'httpd' on 'bootstrap-e2e-minion-group-gnb8' to be 'Running' but was 'Pending' Nov 26 18:51:49.642: INFO: Pod "httpd": Phase="Pending", Reason="", readiness=false. Elapsed: 8.137155686s Nov 26 18:51:49.642: INFO: Error evaluating pod condition running and ready: want pod 'httpd' on 'bootstrap-e2e-minion-group-gnb8' to be 'Running' but was 'Pending' Nov 26 18:51:51.681: INFO: Pod "httpd": Phase="Pending", Reason="", readiness=false. Elapsed: 10.175818967s Nov 26 18:51:51.681: INFO: Error evaluating pod condition running and ready: want pod 'httpd' on 'bootstrap-e2e-minion-group-gnb8' to be 'Running' but was 'Pending' Nov 26 18:51:53.664: INFO: Pod "httpd": Phase="Pending", Reason="", readiness=false. Elapsed: 12.159085399s Nov 26 18:51:53.664: INFO: Error evaluating pod condition running and ready: want pod 'httpd' on 'bootstrap-e2e-minion-group-gnb8' to be 'Running' but was 'Pending' Nov 26 18:51:55.620: INFO: Pod "httpd": Phase="Pending", Reason="", readiness=false. Elapsed: 14.114757647s Nov 26 18:51:55.620: INFO: Error evaluating pod condition running and ready: want pod 'httpd' on 'bootstrap-e2e-minion-group-gnb8' to be 'Running' but was 'Pending' Nov 26 18:51:57.623: INFO: Pod "httpd": Phase="Running", Reason="", readiness=true. 
Elapsed: 16.11769865s Nov 26 18:51:57.623: INFO: Pod "httpd" satisfied condition "running and ready" Nov 26 18:51:57.623: INFO: Wanted all 1 pods to be running and ready. Result: true. Pods: [httpd] [It] [Slow] running a failing command without --restart=Never test/e2e/kubectl/kubectl.go:558 Nov 26 18:51:57.623: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=kubectl-8079 run -i --image=registry.k8s.io/e2e-test-images/busybox:1.29-4 --restart=OnFailure --pod-running-timeout=2m0s failure-2 -- /bin/sh -c cat && exit 42' Nov 26 18:52:08.600: INFO: rc: 1 Nov 26 18:52:08.600: FAIL: Missing expected 'timed out' error, got: exec.CodeExitError{Err:(*errors.errorString)(0xc00165a8a0), Code:1} Full Stack Trace k8s.io/kubernetes/test/e2e/kubectl.glob..func1.8.7.5() test/e2e/kubectl/kubectl.go:567 +0x31e [AfterEach] Simple pod test/e2e/kubectl/kubectl.go:417 STEP: using delete to clean up resources 11/26/22 18:52:08.601 Nov 26 18:52:08.601: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=kubectl-8079 delete --grace-period=0 --force -f -' Nov 26 18:52:08.947: INFO: stderr: "Warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.\n" Nov 26 18:52:08.947: INFO: stdout: "pod \"httpd\" force deleted\n" Nov 26 18:52:08.947: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=kubectl-8079 get rc,svc -l name=httpd --no-headers' Nov 26 18:52:09.485: INFO: stderr: "No resources found in kubectl-8079 namespace.\n" Nov 26 18:52:09.485: INFO: stdout: "" Nov 26 18:52:09.485: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=kubectl-8079 get pods -l name=httpd -o go-template={{ range .items }}{{ if not .metadata.deletionTimestamp }}{{ .metadata.name }}{{ "\n" }}{{ end }}{{ end }}' Nov 26 18:52:09.800: INFO: stderr: "" Nov 26 18:52:09.800: INFO: stdout: "" [AfterEach] [sig-cli] Kubectl client test/e2e/framework/node/init/init.go:32 Nov 26 18:52:09.800: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready [DeferCleanup (Each)] [sig-cli] Kubectl client test/e2e/framework/metrics/init/init.go:33 [DeferCleanup (Each)] [sig-cli] Kubectl client dump namespaces | framework.go:196 STEP: dump namespace information after failure 11/26/22 18:52:09.886 STEP: Collecting events from namespace "kubectl-8079". 11/26/22 18:52:09.886 STEP: Found 11 events. 
11/26/22 18:52:09.957 Nov 26 18:52:09.957: INFO: At 2022-11-26 18:51:41 +0000 UTC - event for httpd: {default-scheduler } Scheduled: Successfully assigned kubectl-8079/httpd to bootstrap-e2e-minion-group-gnb8 Nov 26 18:52:09.957: INFO: At 2022-11-26 18:51:43 +0000 UTC - event for httpd: {kubelet bootstrap-e2e-minion-group-gnb8} Pulled: Container image "registry.k8s.io/e2e-test-images/httpd:2.4.38-4" already present on machine Nov 26 18:52:09.957: INFO: At 2022-11-26 18:51:43 +0000 UTC - event for httpd: {kubelet bootstrap-e2e-minion-group-gnb8} Created: Created container httpd Nov 26 18:52:09.957: INFO: At 2022-11-26 18:51:44 +0000 UTC - event for httpd: {kubelet bootstrap-e2e-minion-group-gnb8} Started: Started container httpd Nov 26 18:52:09.957: INFO: At 2022-11-26 18:51:52 +0000 UTC - event for httpd: {kubelet bootstrap-e2e-minion-group-gnb8} Killing: Stopping container httpd Nov 26 18:52:09.957: INFO: At 2022-11-26 18:51:53 +0000 UTC - event for httpd: {kubelet bootstrap-e2e-minion-group-gnb8} SandboxChanged: Pod sandbox changed, it will be killed and re-created. Nov 26 18:52:09.957: INFO: At 2022-11-26 18:51:57 +0000 UTC - event for failure-2: {default-scheduler } Scheduled: Successfully assigned kubectl-8079/failure-2 to bootstrap-e2e-minion-group-gnb8 Nov 26 18:52:09.957: INFO: At 2022-11-26 18:51:59 +0000 UTC - event for failure-2: {kubelet bootstrap-e2e-minion-group-gnb8} Pulled: Container image "registry.k8s.io/e2e-test-images/busybox:1.29-4" already present on machine Nov 26 18:52:09.957: INFO: At 2022-11-26 18:51:59 +0000 UTC - event for failure-2: {kubelet bootstrap-e2e-minion-group-gnb8} Created: Created container failure-2 Nov 26 18:52:09.957: INFO: At 2022-11-26 18:51:59 +0000 UTC - event for failure-2: {kubelet bootstrap-e2e-minion-group-gnb8} Started: Started container failure-2 Nov 26 18:52:09.957: INFO: At 2022-11-26 18:52:00 +0000 UTC - event for failure-2: {kubelet bootstrap-e2e-minion-group-gnb8} Killing: Stopping container failure-2 Nov 26 18:52:10.035: INFO: POD NODE PHASE GRACE CONDITIONS Nov 26 18:52:10.035: INFO: failure-2 bootstrap-e2e-minion-group-gnb8 Running [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:51:57 +0000 UTC } {Ready True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:51:59 +0000 UTC } {ContainersReady True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:51:59 +0000 UTC } {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:51:57 +0000 UTC }] Nov 26 18:52:10.035: INFO: Nov 26 18:52:10.147: INFO: Unable to fetch kubectl-8079/failure-2/failure-2 logs: an error on the server ("unknown") has prevented the request from succeeding (get pods failure-2) Nov 26 18:52:10.220: INFO: Logging node info for node bootstrap-e2e-master Nov 26 18:52:10.280: INFO: Node Info: &Node{ObjectMeta:{bootstrap-e2e-master 3b91a491-10ed-470d-8d9a-7e47529f6987 6909 0 2022-11-26 18:39:29 +0000 UTC <nil> <nil> map[beta.kubernetes.io/arch:amd64 beta.kubernetes.io/instance-type:n1-standard-1 beta.kubernetes.io/os:linux cloud.google.com/metadata-proxy-ready:true failure-domain.beta.kubernetes.io/region:us-west1 failure-domain.beta.kubernetes.io/zone:us-west1-b kubernetes.io/arch:amd64 kubernetes.io/hostname:bootstrap-e2e-master kubernetes.io/os:linux node.kubernetes.io/instance-type:n1-standard-1 topology.kubernetes.io/region:us-west1 topology.kubernetes.io/zone:us-west1-b] map[node.alpha.kubernetes.io/ttl:0 volumes.kubernetes.io/controller-managed-attach-detach:true] [] [] [{kubelet Update v1 2022-11-26 18:39:29 +0000 UTC FieldsV1 
{"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-attach-detach":{}},"f:labels":{".":{},"f:beta.kubernetes.io/arch":{},"f:beta.kubernetes.io/instance-type":{},"f:beta.kubernetes.io/os":{},"f:cloud.google.com/metadata-proxy-ready":{},"f:failure-domain.beta.kubernetes.io/region":{},"f:failure-domain.beta.kubernetes.io/zone":{},"f:kubernetes.io/arch":{},"f:kubernetes.io/hostname":{},"f:kubernetes.io/os":{},"f:node.kubernetes.io/instance-type":{},"f:topology.kubernetes.io/region":{},"f:topology.kubernetes.io/zone":{}}},"f:spec":{"f:providerID":{},"f:unschedulable":{}}} } {kube-controller-manager Update v1 2022-11-26 18:39:46 +0000 UTC FieldsV1 {"f:status":{"f:conditions":{"k:{\"type\":\"NetworkUnavailable\"}":{"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{}}}}} status} {kube-controller-manager Update v1 2022-11-26 18:39:49 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}},"f:spec":{"f:podCIDR":{},"f:podCIDRs":{".":{},"v:\"10.64.0.0/24\"":{}},"f:taints":{}}} } {kubelet Update v1 2022-11-26 18:50:55 +0000 UTC FieldsV1 {"f:status":{"f:conditions":{"k:{\"type\":\"DiskPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"MemoryPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"PIDPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"Ready\"}":{"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{}}},"f:images":{}}} status}]},Spec:NodeSpec{PodCIDR:10.64.0.0/24,DoNotUseExternalID:,ProviderID:gce://k8s-jkns-e2e-gke-ubuntu-slow/us-west1-b/bootstrap-e2e-master,Unschedulable:true,Taints:[]Taint{Taint{Key:node-role.kubernetes.io/master,Value:,Effect:NoSchedule,TimeAdded:<nil>,},Taint{Key:node.kubernetes.io/unschedulable,Value:,Effect:NoSchedule,TimeAdded:<nil>,},},ConfigSource:nil,PodCIDRs:[10.64.0.0/24],},Status:NodeStatus{Capacity:ResourceList{attachable-volumes-gce-pd: {{127 0} {<nil>} 127 DecimalSI},cpu: {{1 0} {<nil>} 1 DecimalSI},ephemeral-storage: {{16656896000 0} {<nil>} 16266500Ki BinarySI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{3858374656 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Allocatable:ResourceList{attachable-volumes-gce-pd: {{127 0} {<nil>} 127 DecimalSI},cpu: {{1 0} {<nil>} 1 DecimalSI},ephemeral-storage: {{14991206376 0} {<nil>} 14991206376 DecimalSI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{3596230656 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Phase:,Conditions:[]NodeCondition{NodeCondition{Type:NetworkUnavailable,Status:False,LastHeartbeatTime:2022-11-26 18:39:46 +0000 UTC,LastTransitionTime:2022-11-26 18:39:46 +0000 UTC,Reason:RouteCreated,Message:RouteController created a route,},NodeCondition{Type:MemoryPressure,Status:False,LastHeartbeatTime:2022-11-26 18:50:55 +0000 UTC,LastTransitionTime:2022-11-26 18:39:29 +0000 UTC,Reason:KubeletHasSufficientMemory,Message:kubelet has sufficient memory available,},NodeCondition{Type:DiskPressure,Status:False,LastHeartbeatTime:2022-11-26 18:50:55 +0000 UTC,LastTransitionTime:2022-11-26 18:39:29 +0000 UTC,Reason:KubeletHasNoDiskPressure,Message:kubelet has no disk pressure,},NodeCondition{Type:PIDPressure,Status:False,LastHeartbeatTime:2022-11-26 18:50:55 +0000 UTC,LastTransitionTime:2022-11-26 18:39:29 +0000 UTC,Reason:KubeletHasSufficientPID,Message:kubelet has sufficient PID 
available,},NodeCondition{Type:Ready,Status:True,LastHeartbeatTime:2022-11-26 18:50:55 +0000 UTC,LastTransitionTime:2022-11-26 18:39:48 +0000 UTC,Reason:KubeletReady,Message:kubelet is posting ready status. AppArmor enabled,},},Addresses:[]NodeAddress{NodeAddress{Type:InternalIP,Address:10.138.0.2,},NodeAddress{Type:ExternalIP,Address:34.83.88.61,},NodeAddress{Type:InternalDNS,Address:bootstrap-e2e-master.c.k8s-jkns-e2e-gke-ubuntu-slow.internal,},NodeAddress{Type:Hostname,Address:bootstrap-e2e-master.c.k8s-jkns-e2e-gke-ubuntu-slow.internal,},},DaemonEndpoints:NodeDaemonEndpoints{KubeletEndpoint:DaemonEndpoint{Port:10250,},},NodeInfo:NodeSystemInfo{MachineID:3cf1b58d8a4ce7f42c27588b99e07e59,SystemUUID:3cf1b58d-8a4c-e7f4-2c27-588b99e07e59,BootID:5ae5a72a-a603-4ca4-8f28-2f5024cd1d88,KernelVersion:5.10.123+,OSImage:Container-Optimized OS from Google,ContainerRuntimeVersion:containerd://1.7.0-beta.0-149-gd06318622,KubeletVersion:v1.27.0-alpha.0.50+70617042976dc1,KubeProxyVersion:v1.27.0-alpha.0.50+70617042976dc1,OperatingSystem:linux,Architecture:amd64,},Images:[]ContainerImage{ContainerImage{Names:[registry.k8s.io/kube-apiserver-amd64:v1.27.0-alpha.0.50_70617042976dc1],SizeBytes:135160272,},ContainerImage{Names:[registry.k8s.io/kube-controller-manager-amd64:v1.27.0-alpha.0.50_70617042976dc1],SizeBytes:124990265,},ContainerImage{Names:[registry.k8s.io/etcd@sha256:dd75ec974b0a2a6f6bb47001ba09207976e625db898d1b16735528c009cb171c registry.k8s.io/etcd:3.5.6-0],SizeBytes:102542580,},ContainerImage{Names:[registry.k8s.io/kube-scheduler-amd64:v1.27.0-alpha.0.50_70617042976dc1],SizeBytes:57660216,},ContainerImage{Names:[gke.gcr.io/prometheus-to-sd@sha256:e739643c3939ba0b161425f45a1989eedfc4a3b166db9a7100863296b4c70510 gke.gcr.io/prometheus-to-sd:v0.11.1-gke.1],SizeBytes:48742566,},ContainerImage{Names:[gcr.io/k8s-ingress-image-push/ingress-gce-glbc-amd64@sha256:5db27383add6d9f4ebdf0286409ac31f7f5d273690204b341a4e37998917693b gcr.io/k8s-ingress-image-push/ingress-gce-glbc-amd64:v1.20.1],SizeBytes:36598135,},ContainerImage{Names:[registry.k8s.io/addon-manager/kube-addon-manager@sha256:49cc4e6e4a3745b427ce14b0141476ab339bb65c6bc05033019e046c8727dcb0 registry.k8s.io/addon-manager/kube-addon-manager:v9.1.6],SizeBytes:30464183,},ContainerImage{Names:[registry.k8s.io/kas-network-proxy/proxy-server@sha256:2c111f004bec24888d8cfa2a812a38fb8341350abac67dcd0ac64e709dfe389c registry.k8s.io/kas-network-proxy/proxy-server:v0.0.33],SizeBytes:22020129,},ContainerImage{Names:[registry.k8s.io/metadata-proxy@sha256:e914645f22e946bce5165737e1b244e0a296ad1f0f81a9531adc57af2780978a registry.k8s.io/metadata-proxy:v0.1.12],SizeBytes:5301657,},ContainerImage{Names:[registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d registry.k8s.io/pause:3.8],SizeBytes:311286,},},VolumesInUse:[],VolumesAttached:[]AttachedVolume{},Config:nil,},} Nov 26 18:52:10.280: INFO: Logging kubelet events for node bootstrap-e2e-master Nov 26 18:52:10.493: INFO: Logging pods the kubelet thinks is on node bootstrap-e2e-master Nov 26 18:52:10.653: INFO: Unable to retrieve kubelet pods for node bootstrap-e2e-master: error trying to reach service: No agent available Nov 26 18:52:10.653: INFO: Logging node info for node bootstrap-e2e-minion-group-dzls Nov 26 18:52:10.713: INFO: Node Info: &Node{ObjectMeta:{bootstrap-e2e-minion-group-dzls 073571af-ef8e-4d0a-9070-5398dab26550 8161 0 2022-11-26 18:39:33 +0000 UTC <nil> <nil> map[beta.kubernetes.io/arch:amd64 beta.kubernetes.io/instance-type:n1-standard-2 
beta.kubernetes.io/os:linux cloud.google.com/metadata-proxy-ready:true failure-domain.beta.kubernetes.io/region:us-west1 failure-domain.beta.kubernetes.io/zone:us-west1-b kubernetes.io/arch:amd64 kubernetes.io/hostname:bootstrap-e2e-minion-group-dzls kubernetes.io/os:linux node.kubernetes.io/instance-type:n1-standard-2 topology.hostpath.csi/node:bootstrap-e2e-minion-group-dzls topology.kubernetes.io/region:us-west1 topology.kubernetes.io/zone:us-west1-b] map[csi.volume.kubernetes.io/nodeid:{"csi-hostpath-provisioning-1062":"bootstrap-e2e-minion-group-dzls","csi-mock-csi-mock-volumes-2237":"bootstrap-e2e-minion-group-dzls","csi-mock-csi-mock-volumes-5240":"bootstrap-e2e-minion-group-dzls","csi-mock-csi-mock-volumes-7257":"bootstrap-e2e-minion-group-dzls"} node.alpha.kubernetes.io/ttl:0 volumes.kubernetes.io/controller-managed-attach-detach:true] [] [] [{kube-controller-manager Update v1 2022-11-26 18:39:33 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}},"f:spec":{"f:podCIDR":{},"f:podCIDRs":{".":{},"v:\"10.64.3.0/24\"":{}}}} } {kubelet Update v1 2022-11-26 18:39:33 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-attach-detach":{}},"f:labels":{".":{},"f:beta.kubernetes.io/arch":{},"f:beta.kubernetes.io/instance-type":{},"f:beta.kubernetes.io/os":{},"f:cloud.google.com/metadata-proxy-ready":{},"f:failure-domain.beta.kubernetes.io/region":{},"f:failure-domain.beta.kubernetes.io/zone":{},"f:kubernetes.io/arch":{},"f:kubernetes.io/hostname":{},"f:kubernetes.io/os":{},"f:node.kubernetes.io/instance-type":{},"f:topology.kubernetes.io/region":{},"f:topology.kubernetes.io/zone":{}}},"f:spec":{"f:providerID":{}}} } {node-problem-detector Update v1 2022-11-26 18:50:21 +0000 UTC FieldsV1 {"f:status":{"f:conditions":{"k:{\"type\":\"CorruptDockerOverlay2\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentContainerdRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentDockerRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentKubeletRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentUnregisterNetDevice\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"KernelDeadlock\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"ReadonlyFilesystem\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}}}}} status} {kube-controller-manager Update v1 2022-11-26 18:51:31 +0000 UTC FieldsV1 {"f:status":{"f:conditions":{"k:{\"type\":\"NetworkUnavailable\"}":{"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{}}}}} status} {kubelet Update v1 2022-11-26 18:52:09 +0000 UTC FieldsV1 
{"f:metadata":{"f:annotations":{"f:csi.volume.kubernetes.io/nodeid":{}},"f:labels":{"f:topology.hostpath.csi/node":{}}},"f:status":{"f:conditions":{"k:{\"type\":\"DiskPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"MemoryPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"PIDPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"Ready\"}":{"f:lastHeartbeatTime":{},"f:message":{},"f:reason":{},"f:status":{}}},"f:images":{}}} status}]},Spec:NodeSpec{PodCIDR:10.64.3.0/24,DoNotUseExternalID:,ProviderID:gce://k8s-jkns-e2e-gke-ubuntu-slow/us-west1-b/bootstrap-e2e-minion-group-dzls,Unschedulable:false,Taints:[]Taint{},ConfigSource:nil,PodCIDRs:[10.64.3.0/24],},Status:NodeStatus{Capacity:ResourceList{attachable-volumes-gce-pd: {{127 0} {<nil>} 127 DecimalSI},cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{101203873792 0} {<nil>} 98831908Ki BinarySI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{7815438336 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Allocatable:ResourceList{attachable-volumes-gce-pd: {{127 0} {<nil>} 127 DecimalSI},cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{91083486262 0} {<nil>} 91083486262 DecimalSI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{7553294336 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Phase:,Conditions:[]NodeCondition{NodeCondition{Type:ReadonlyFilesystem,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:FilesystemIsNotReadOnly,Message:Filesystem is not read-only,},NodeCondition{Type:CorruptDockerOverlay2,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoCorruptDockerOverlay2,Message:docker overlay2 is functioning properly,},NodeCondition{Type:FrequentUnregisterNetDevice,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentUnregisterNetDevice,Message:node is functioning properly,},NodeCondition{Type:FrequentKubeletRestart,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentKubeletRestart,Message:kubelet is functioning properly,},NodeCondition{Type:FrequentDockerRestart,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentDockerRestart,Message:docker is functioning properly,},NodeCondition{Type:FrequentContainerdRestart,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentContainerdRestart,Message:containerd is functioning properly,},NodeCondition{Type:KernelDeadlock,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:KernelHasNoDeadlock,Message:kernel has no deadlock,},NodeCondition{Type:NetworkUnavailable,Status:False,LastHeartbeatTime:2022-11-26 18:39:46 +0000 UTC,LastTransitionTime:2022-11-26 18:39:46 +0000 UTC,Reason:RouteCreated,Message:RouteController created a route,},NodeCondition{Type:MemoryPressure,Status:False,LastHeartbeatTime:2022-11-26 18:52:09 +0000 UTC,LastTransitionTime:2022-11-26 18:39:33 +0000 UTC,Reason:KubeletHasSufficientMemory,Message:kubelet has sufficient memory available,},NodeCondition{Type:DiskPressure,Status:False,LastHeartbeatTime:2022-11-26 18:52:09 +0000 UTC,LastTransitionTime:2022-11-26 
18:39:33 +0000 UTC,Reason:KubeletHasNoDiskPressure,Message:kubelet has no disk pressure,},NodeCondition{Type:PIDPressure,Status:False,LastHeartbeatTime:2022-11-26 18:52:09 +0000 UTC,LastTransitionTime:2022-11-26 18:39:33 +0000 UTC,Reason:KubeletHasSufficientPID,Message:kubelet has sufficient PID available,},NodeCondition{Type:Ready,Status:True,LastHeartbeatTime:2022-11-26 18:52:09 +0000 UTC,LastTransitionTime:2022-11-26 18:39:33 +0000 UTC,Reason:KubeletReady,Message:kubelet is posting ready status. AppArmor enabled,},},Addresses:[]NodeAddress{NodeAddress{Type:InternalIP,Address:10.138.0.5,},NodeAddress{Type:ExternalIP,Address:34.168.240.199,},NodeAddress{Type:InternalDNS,Address:bootstrap-e2e-minion-group-dzls.c.k8s-jkns-e2e-gke-ubuntu-slow.internal,},NodeAddress{Type:Hostname,Address:bootstrap-e2e-minion-group-dzls.c.k8s-jkns-e2e-gke-ubuntu-slow.internal,},},DaemonEndpoints:NodeDaemonEndpoints{KubeletEndpoint:DaemonEndpoint{Port:10250,},},NodeInfo:NodeSystemInfo{MachineID:8acf528984519ae32bb25015f71e5ef3,SystemUUID:8acf5289-8451-9ae3-2bb2-5015f71e5ef3,BootID:5e47f3ec-cd63-4220-9ac4-8feb7752a952,KernelVersion:5.10.123+,OSImage:Container-Optimized OS from Google,ContainerRuntimeVersion:containerd://1.7.0-beta.0-149-gd06318622,KubeletVersion:v1.27.0-alpha.0.50+70617042976dc1,KubeProxyVersion:v1.27.0-alpha.0.50+70617042976dc1,OperatingSystem:linux,Architecture:amd64,},Images:[]ContainerImage{ContainerImage{Names:[registry.k8s.io/e2e-test-images/jessie-dnsutils@sha256:24aaf2626d6b27864c29de2097e8bbb840b3a414271bf7c8995e431e47d8408e registry.k8s.io/e2e-test-images/jessie-dnsutils:1.7],SizeBytes:112030336,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/volume/nfs@sha256:3bda73f2428522b0e342af80a0b9679e8594c2126f2b3cca39ed787589741b9e registry.k8s.io/e2e-test-images/volume/nfs:1.3],SizeBytes:95836203,},ContainerImage{Names:[registry.k8s.io/sig-storage/nfs-provisioner@sha256:e943bb77c7df05ebdc8c7888b2db289b13bf9f012d6a3a5a74f14d4d5743d439 registry.k8s.io/sig-storage/nfs-provisioner:v3.0.1],SizeBytes:90632047,},ContainerImage{Names:[registry.k8s.io/kube-proxy-amd64:v1.27.0-alpha.0.50_70617042976dc1],SizeBytes:67201736,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/agnhost@sha256:16bbf38c463a4223d8cfe4da12bc61010b082a79b4bb003e2d3ba3ece5dd5f9e registry.k8s.io/e2e-test-images/agnhost:2.43],SizeBytes:51706353,},ContainerImage{Names:[gke.gcr.io/prometheus-to-sd@sha256:e739643c3939ba0b161425f45a1989eedfc4a3b166db9a7100863296b4c70510 gke.gcr.io/prometheus-to-sd:v0.11.1-gke.1],SizeBytes:48742566,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/httpd@sha256:148b022f5c5da426fc2f3c14b5c0867e58ef05961510c84749ac1fddcb0fef22 registry.k8s.io/e2e-test-images/httpd:2.4.38-4],SizeBytes:40764257,},ContainerImage{Names:[registry.k8s.io/metrics-server/metrics-server@sha256:6385aec64bb97040a5e692947107b81e178555c7a5b71caa90d733e4130efc10 registry.k8s.io/metrics-server/metrics-server:v0.5.2],SizeBytes:26023008,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-provisioner@sha256:ee3b525d5b89db99da3b8eb521d9cd90cb6e9ef0fbb651e98bb37be78d36b5b8 registry.k8s.io/sig-storage/csi-provisioner:v3.3.0],SizeBytes:25491225,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-resizer@sha256:425d8f1b769398127767b06ed97ce62578a3179bcb99809ce93a1649e025ffe7 registry.k8s.io/sig-storage/csi-resizer:v1.6.0],SizeBytes:24148884,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-snapshotter@sha256:291334908ddf71a4661fd7f6d9d97274de8a5378a2b6fdfeb2ce73414a34f82f 
registry.k8s.io/sig-storage/csi-snapshotter:v6.1.0],SizeBytes:23881995,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-attacher@sha256:9a685020911e2725ad019dbce6e4a5ab93d51e3d4557f115e64343345e05781b registry.k8s.io/sig-storage/csi-attacher:v4.0.0],SizeBytes:23847201,},ContainerImage{Names:[registry.k8s.io/sig-storage/snapshot-controller@sha256:823c75d0c45d1427f6d850070956d9ca657140a7bbf828381541d1d808475280 registry.k8s.io/sig-storage/snapshot-controller:v6.1.0],SizeBytes:22620891,},ContainerImage{Names:[registry.k8s.io/sig-storage/hostpathplugin@sha256:92257881c1d6493cf18299a24af42330f891166560047902b8d431fb66b01af5 registry.k8s.io/sig-storage/hostpathplugin:v1.9.0],SizeBytes:18758628,},ContainerImage{Names:[registry.k8s.io/cpa/cluster-proportional-autoscaler@sha256:fd636b33485c7826fb20ef0688a83ee0910317dbb6c0c6f3ad14661c1db25def registry.k8s.io/cpa/cluster-proportional-autoscaler:1.8.4],SizeBytes:15209393,},ContainerImage{Names:[registry.k8s.io/coredns/coredns@sha256:8e352a029d304ca7431c6507b56800636c321cb52289686a581ab70aaa8a2e2a registry.k8s.io/coredns/coredns:v1.9.3],SizeBytes:14837849,},ContainerImage{Names:[registry.k8s.io/autoscaling/addon-resizer@sha256:43f129b81d28f0fdd54de6d8e7eacd5728030782e03db16087fc241ad747d3d6 registry.k8s.io/autoscaling/addon-resizer:1.8.14],SizeBytes:10153852,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-node-driver-registrar@sha256:0103eee7c35e3e0b5cd8cdca9850dc71c793cdeb6669d8be7a89440da2d06ae4 registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.5.1],SizeBytes:9133109,},ContainerImage{Names:[registry.k8s.io/sig-storage/livenessprobe@sha256:933940f13b3ea0abc62e656c1aa5c5b47c04b15d71250413a6b821bd0c58b94e registry.k8s.io/sig-storage/livenessprobe:v2.7.0],SizeBytes:8688564,},ContainerImage{Names:[registry.k8s.io/kas-network-proxy/proxy-agent@sha256:48f2a4ec3e10553a81b8dd1c6fa5fe4bcc9617f78e71c1ca89c6921335e2d7da registry.k8s.io/kas-network-proxy/proxy-agent:v0.0.33],SizeBytes:8512162,},ContainerImage{Names:[registry.k8s.io/networking/ingress-gce-404-server-with-metrics-amd64@sha256:7eb7b3cee4d33c10c49893ad3c386232b86d4067de5251294d4c620d6e072b93 registry.k8s.io/networking/ingress-gce-404-server-with-metrics-amd64:v1.10.11],SizeBytes:6463068,},ContainerImage{Names:[registry.k8s.io/metadata-proxy@sha256:e914645f22e946bce5165737e1b244e0a296ad1f0f81a9531adc57af2780978a registry.k8s.io/metadata-proxy:v0.1.12],SizeBytes:5301657,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/busybox@sha256:c318242786b139d18676b1c09a0ad7f15fc17f8f16a5b2e625cd0dc8c9703daf registry.k8s.io/e2e-test-images/busybox:1.29-2],SizeBytes:732424,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/busybox@sha256:2e0f836850e09b8b7cc937681d6194537a09fbd5f6b9e08f4d646a85128e8937 registry.k8s.io/e2e-test-images/busybox:1.29-4],SizeBytes:731990,},ContainerImage{Names:[registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097 registry.k8s.io/pause:3.9],SizeBytes:321520,},ContainerImage{Names:[registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d registry.k8s.io/pause:3.8],SizeBytes:311286,},},VolumesInUse:[],VolumesAttached:[]AttachedVolume{},Config:nil,},} Nov 26 18:52:10.713: INFO: Logging kubelet events for node bootstrap-e2e-minion-group-dzls Nov 26 18:52:10.775: INFO: Logging pods the kubelet thinks is on node bootstrap-e2e-minion-group-dzls Nov 26 18:52:10.856: INFO: Unable to retrieve kubelet pods for node bootstrap-e2e-minion-group-dzls: error trying to reach service: No agent 
available Nov 26 18:52:10.856: INFO: Logging node info for node bootstrap-e2e-minion-group-gnb8 Nov 26 18:52:10.914: INFO: Node Info: &Node{ObjectMeta:{bootstrap-e2e-minion-group-gnb8 d034f9b4-7cc6-4262-a4af-02040c7b7abd 7699 0 2022-11-26 18:39:33 +0000 UTC <nil> <nil> map[beta.kubernetes.io/arch:amd64 beta.kubernetes.io/instance-type:n1-standard-2 beta.kubernetes.io/os:linux cloud.google.com/metadata-proxy-ready:true failure-domain.beta.kubernetes.io/region:us-west1 failure-domain.beta.kubernetes.io/zone:us-west1-b kubernetes.io/arch:amd64 kubernetes.io/hostname:bootstrap-e2e-minion-group-gnb8 kubernetes.io/os:linux node.kubernetes.io/instance-type:n1-standard-2 topology.hostpath.csi/node:bootstrap-e2e-minion-group-gnb8 topology.kubernetes.io/region:us-west1 topology.kubernetes.io/zone:us-west1-b] map[csi.volume.kubernetes.io/nodeid:{"csi-hostpath-multivolume-2875":"bootstrap-e2e-minion-group-gnb8","csi-hostpath-provisioning-1566":"bootstrap-e2e-minion-group-gnb8"} node.alpha.kubernetes.io/ttl:0 volumes.kubernetes.io/controller-managed-attach-detach:true] [] [] [{kube-controller-manager Update v1 2022-11-26 18:39:33 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}},"f:spec":{"f:podCIDR":{},"f:podCIDRs":{".":{},"v:\"10.64.2.0/24\"":{}}}} } {kubelet Update v1 2022-11-26 18:39:33 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-attach-detach":{}},"f:labels":{".":{},"f:beta.kubernetes.io/arch":{},"f:beta.kubernetes.io/instance-type":{},"f:beta.kubernetes.io/os":{},"f:cloud.google.com/metadata-proxy-ready":{},"f:failure-domain.beta.kubernetes.io/region":{},"f:failure-domain.beta.kubernetes.io/zone":{},"f:kubernetes.io/arch":{},"f:kubernetes.io/hostname":{},"f:kubernetes.io/os":{},"f:node.kubernetes.io/instance-type":{},"f:topology.kubernetes.io/region":{},"f:topology.kubernetes.io/zone":{}}},"f:spec":{"f:providerID":{}}} } {node-problem-detector Update v1 2022-11-26 18:50:21 +0000 UTC FieldsV1 {"f:status":{"f:conditions":{"k:{\"type\":\"CorruptDockerOverlay2\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentContainerdRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentDockerRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentKubeletRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentUnregisterNetDevice\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"KernelDeadlock\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"ReadonlyFilesystem\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}}}}} status} {kube-controller-manager Update v1 2022-11-26 18:51:29 +0000 UTC FieldsV1 {"f:status":{"f:conditions":{"k:{\"type\":\"NetworkUnavailable\"}":{"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{}}}}} status} {kubelet Update v1 2022-11-26 18:51:40 +0000 UTC FieldsV1 
{"f:metadata":{"f:annotations":{"f:csi.volume.kubernetes.io/nodeid":{}},"f:labels":{"f:topology.hostpath.csi/node":{}}},"f:status":{"f:conditions":{"k:{\"type\":\"DiskPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"MemoryPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"PIDPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"Ready\"}":{"f:lastHeartbeatTime":{},"f:message":{},"f:reason":{},"f:status":{}}},"f:images":{}}} status}]},Spec:NodeSpec{PodCIDR:10.64.2.0/24,DoNotUseExternalID:,ProviderID:gce://k8s-jkns-e2e-gke-ubuntu-slow/us-west1-b/bootstrap-e2e-minion-group-gnb8,Unschedulable:false,Taints:[]Taint{},ConfigSource:nil,PodCIDRs:[10.64.2.0/24],},Status:NodeStatus{Capacity:ResourceList{attachable-volumes-gce-pd: {{127 0} {<nil>} 127 DecimalSI},cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{101203873792 0} {<nil>} 98831908Ki BinarySI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{7815438336 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Allocatable:ResourceList{attachable-volumes-gce-pd: {{127 0} {<nil>} 127 DecimalSI},cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{91083486262 0} {<nil>} 91083486262 DecimalSI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{7553294336 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Phase:,Conditions:[]NodeCondition{NodeCondition{Type:FrequentDockerRestart,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentDockerRestart,Message:docker is functioning properly,},NodeCondition{Type:FrequentContainerdRestart,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentContainerdRestart,Message:containerd is functioning properly,},NodeCondition{Type:CorruptDockerOverlay2,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoCorruptDockerOverlay2,Message:docker overlay2 is functioning properly,},NodeCondition{Type:KernelDeadlock,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:KernelHasNoDeadlock,Message:kernel has no deadlock,},NodeCondition{Type:ReadonlyFilesystem,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:FilesystemIsNotReadOnly,Message:Filesystem is not read-only,},NodeCondition{Type:FrequentUnregisterNetDevice,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentUnregisterNetDevice,Message:node is functioning properly,},NodeCondition{Type:FrequentKubeletRestart,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentKubeletRestart,Message:kubelet is functioning properly,},NodeCondition{Type:NetworkUnavailable,Status:False,LastHeartbeatTime:2022-11-26 18:39:46 +0000 UTC,LastTransitionTime:2022-11-26 18:39:46 +0000 UTC,Reason:RouteCreated,Message:RouteController created a route,},NodeCondition{Type:MemoryPressure,Status:False,LastHeartbeatTime:2022-11-26 18:51:40 +0000 UTC,LastTransitionTime:2022-11-26 18:39:33 +0000 UTC,Reason:KubeletHasSufficientMemory,Message:kubelet has sufficient memory available,},NodeCondition{Type:DiskPressure,Status:False,LastHeartbeatTime:2022-11-26 18:51:40 +0000 UTC,LastTransitionTime:2022-11-26 
18:39:33 +0000 UTC,Reason:KubeletHasNoDiskPressure,Message:kubelet has no disk pressure,},NodeCondition{Type:PIDPressure,Status:False,LastHeartbeatTime:2022-11-26 18:51:40 +0000 UTC,LastTransitionTime:2022-11-26 18:39:33 +0000 UTC,Reason:KubeletHasSufficientPID,Message:kubelet has sufficient PID available,},NodeCondition{Type:Ready,Status:True,LastHeartbeatTime:2022-11-26 18:51:40 +0000 UTC,LastTransitionTime:2022-11-26 18:39:33 +0000 UTC,Reason:KubeletReady,Message:kubelet is posting ready status. AppArmor enabled,},},Addresses:[]NodeAddress{NodeAddress{Type:InternalIP,Address:10.138.0.4,},NodeAddress{Type:ExternalIP,Address:34.83.41.63,},NodeAddress{Type:InternalDNS,Address:bootstrap-e2e-minion-group-gnb8.c.k8s-jkns-e2e-gke-ubuntu-slow.internal,},NodeAddress{Type:Hostname,Address:bootstrap-e2e-minion-group-gnb8.c.k8s-jkns-e2e-gke-ubuntu-slow.internal,},},DaemonEndpoints:NodeDaemonEndpoints{KubeletEndpoint:DaemonEndpoint{Port:10250,},},NodeInfo:NodeSystemInfo{MachineID:975025d51ee8dddd080415bf76b18e9f,SystemUUID:975025d5-1ee8-dddd-0804-15bf76b18e9f,BootID:78ba7c41-cb6b-4d0e-9530-01248f3478cf,KernelVersion:5.10.123+,OSImage:Container-Optimized OS from Google,ContainerRuntimeVersion:containerd://1.7.0-beta.0-149-gd06318622,KubeletVersion:v1.27.0-alpha.0.50+70617042976dc1,KubeProxyVersion:v1.27.0-alpha.0.50+70617042976dc1,OperatingSystem:linux,Architecture:amd64,},Images:[]ContainerImage{ContainerImage{Names:[registry.k8s.io/e2e-test-images/jessie-dnsutils@sha256:24aaf2626d6b27864c29de2097e8bbb840b3a414271bf7c8995e431e47d8408e registry.k8s.io/e2e-test-images/jessie-dnsutils:1.7],SizeBytes:112030336,},ContainerImage{Names:[registry.k8s.io/sig-storage/nfs-provisioner@sha256:e943bb77c7df05ebdc8c7888b2db289b13bf9f012d6a3a5a74f14d4d5743d439 registry.k8s.io/sig-storage/nfs-provisioner:v3.0.1],SizeBytes:90632047,},ContainerImage{Names:[registry.k8s.io/kube-proxy-amd64:v1.27.0-alpha.0.50_70617042976dc1],SizeBytes:67201736,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/agnhost@sha256:16bbf38c463a4223d8cfe4da12bc61010b082a79b4bb003e2d3ba3ece5dd5f9e registry.k8s.io/e2e-test-images/agnhost:2.43],SizeBytes:51706353,},ContainerImage{Names:[gke.gcr.io/prometheus-to-sd@sha256:e739643c3939ba0b161425f45a1989eedfc4a3b166db9a7100863296b4c70510 gke.gcr.io/prometheus-to-sd:v0.11.1-gke.1],SizeBytes:48742566,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/httpd@sha256:148b022f5c5da426fc2f3c14b5c0867e58ef05961510c84749ac1fddcb0fef22 registry.k8s.io/e2e-test-images/httpd:2.4.38-4],SizeBytes:40764257,},ContainerImage{Names:[registry.k8s.io/metrics-server/metrics-server@sha256:6385aec64bb97040a5e692947107b81e178555c7a5b71caa90d733e4130efc10 registry.k8s.io/metrics-server/metrics-server:v0.5.2],SizeBytes:26023008,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-provisioner@sha256:ee3b525d5b89db99da3b8eb521d9cd90cb6e9ef0fbb651e98bb37be78d36b5b8 registry.k8s.io/sig-storage/csi-provisioner:v3.3.0],SizeBytes:25491225,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-resizer@sha256:425d8f1b769398127767b06ed97ce62578a3179bcb99809ce93a1649e025ffe7 registry.k8s.io/sig-storage/csi-resizer:v1.6.0],SizeBytes:24148884,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-snapshotter@sha256:291334908ddf71a4661fd7f6d9d97274de8a5378a2b6fdfeb2ce73414a34f82f registry.k8s.io/sig-storage/csi-snapshotter:v6.1.0],SizeBytes:23881995,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-attacher@sha256:9a685020911e2725ad019dbce6e4a5ab93d51e3d4557f115e64343345e05781b 
registry.k8s.io/sig-storage/csi-attacher:v4.0.0],SizeBytes:23847201,},ContainerImage{Names:[registry.k8s.io/sig-storage/hostpathplugin@sha256:92257881c1d6493cf18299a24af42330f891166560047902b8d431fb66b01af5 registry.k8s.io/sig-storage/hostpathplugin:v1.9.0],SizeBytes:18758628,},ContainerImage{Names:[registry.k8s.io/autoscaling/addon-resizer@sha256:43f129b81d28f0fdd54de6d8e7eacd5728030782e03db16087fc241ad747d3d6 registry.k8s.io/autoscaling/addon-resizer:1.8.14],SizeBytes:10153852,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-node-driver-registrar@sha256:0103eee7c35e3e0b5cd8cdca9850dc71c793cdeb6669d8be7a89440da2d06ae4 registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.5.1],SizeBytes:9133109,},ContainerImage{Names:[registry.k8s.io/sig-storage/livenessprobe@sha256:933940f13b3ea0abc62e656c1aa5c5b47c04b15d71250413a6b821bd0c58b94e registry.k8s.io/sig-storage/livenessprobe:v2.7.0],SizeBytes:8688564,},ContainerImage{Names:[registry.k8s.io/kas-network-proxy/proxy-agent@sha256:48f2a4ec3e10553a81b8dd1c6fa5fe4bcc9617f78e71c1ca89c6921335e2d7da registry.k8s.io/kas-network-proxy/proxy-agent:v0.0.33],SizeBytes:8512162,},ContainerImage{Names:[registry.k8s.io/metadata-proxy@sha256:e914645f22e946bce5165737e1b244e0a296ad1f0f81a9531adc57af2780978a registry.k8s.io/metadata-proxy:v0.1.12],SizeBytes:5301657,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/busybox@sha256:c318242786b139d18676b1c09a0ad7f15fc17f8f16a5b2e625cd0dc8c9703daf registry.k8s.io/e2e-test-images/busybox:1.29-2],SizeBytes:732424,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/busybox@sha256:2e0f836850e09b8b7cc937681d6194537a09fbd5f6b9e08f4d646a85128e8937 registry.k8s.io/e2e-test-images/busybox:1.29-4],SizeBytes:731990,},ContainerImage{Names:[registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097 registry.k8s.io/pause:3.9],SizeBytes:321520,},ContainerImage{Names:[registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d registry.k8s.io/pause:3.8],SizeBytes:311286,},},VolumesInUse:[],VolumesAttached:[]AttachedVolume{},Config:nil,},} Nov 26 18:52:10.914: INFO: Logging kubelet events for node bootstrap-e2e-minion-group-gnb8 Nov 26 18:52:10.983: INFO: Logging pods the kubelet thinks is on node bootstrap-e2e-minion-group-gnb8 Nov 26 18:52:11.075: INFO: Unable to retrieve kubelet pods for node bootstrap-e2e-minion-group-gnb8: error trying to reach service: No agent available Nov 26 18:52:11.075: INFO: Logging node info for node bootstrap-e2e-minion-group-p1wq Nov 26 18:52:11.144: INFO: Node Info: &Node{ObjectMeta:{bootstrap-e2e-minion-group-p1wq e4a1a2e4-c0df-40e4-9f3f-bc50b8a3f74f 8171 0 2022-11-26 18:39:32 +0000 UTC <nil> <nil> map[beta.kubernetes.io/arch:amd64 beta.kubernetes.io/instance-type:n1-standard-2 beta.kubernetes.io/os:linux cloud.google.com/metadata-proxy-ready:true failure-domain.beta.kubernetes.io/region:us-west1 failure-domain.beta.kubernetes.io/zone:us-west1-b kubernetes.io/arch:amd64 kubernetes.io/hostname:bootstrap-e2e-minion-group-p1wq kubernetes.io/os:linux node.kubernetes.io/instance-type:n1-standard-2 topology.hostpath.csi/node:bootstrap-e2e-minion-group-p1wq topology.kubernetes.io/region:us-west1 topology.kubernetes.io/zone:us-west1-b] map[csi.volume.kubernetes.io/nodeid:{"csi-hostpath-provisioning-4955":"bootstrap-e2e-minion-group-p1wq","csi-hostpath-provisioning-8805":"bootstrap-e2e-minion-group-p1wq","csi-mock-csi-mock-volumes-2715":"bootstrap-e2e-minion-group-p1wq"} node.alpha.kubernetes.io/ttl:0 
volumes.kubernetes.io/controller-managed-attach-detach:true] [] [] [{kubelet Update v1 2022-11-26 18:39:32 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-attach-detach":{}},"f:labels":{".":{},"f:beta.kubernetes.io/arch":{},"f:beta.kubernetes.io/instance-type":{},"f:beta.kubernetes.io/os":{},"f:cloud.google.com/metadata-proxy-ready":{},"f:failure-domain.beta.kubernetes.io/region":{},"f:failure-domain.beta.kubernetes.io/zone":{},"f:kubernetes.io/arch":{},"f:kubernetes.io/hostname":{},"f:kubernetes.io/os":{},"f:node.kubernetes.io/instance-type":{},"f:topology.kubernetes.io/region":{},"f:topology.kubernetes.io/zone":{}}},"f:spec":{"f:providerID":{}}} } {kube-controller-manager Update v1 2022-11-26 18:39:33 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}},"f:spec":{"f:podCIDR":{},"f:podCIDRs":{".":{},"v:\"10.64.1.0/24\"":{}}}} } {node-problem-detector Update v1 2022-11-26 18:50:20 +0000 UTC FieldsV1 {"f:status":{"f:conditions":{"k:{\"type\":\"CorruptDockerOverlay2\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentContainerdRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentDockerRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentKubeletRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentUnregisterNetDevice\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"KernelDeadlock\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"ReadonlyFilesystem\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}}}}} status} {kube-controller-manager Update v1 2022-11-26 18:51:50 +0000 UTC FieldsV1 {"f:status":{"f:conditions":{"k:{\"type\":\"NetworkUnavailable\"}":{"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{}}}}} status} {kubelet Update v1 2022-11-26 18:52:09 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:csi.volume.kubernetes.io/nodeid":{}},"f:labels":{"f:topology.hostpath.csi/node":{}}},"f:status":{"f:conditions":{"k:{\"type\":\"DiskPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"MemoryPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"PIDPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"Ready\"}":{"f:lastHeartbeatTime":{},"f:message":{},"f:reason":{},"f:status":{}}},"f:images":{}}} status}]},Spec:NodeSpec{PodCIDR:10.64.1.0/24,DoNotUseExternalID:,ProviderID:gce://k8s-jkns-e2e-gke-ubuntu-slow/us-west1-b/bootstrap-e2e-minion-group-p1wq,Unschedulable:false,Taints:[]Taint{},ConfigSource:nil,PodCIDRs:[10.64.1.0/24],},Status:NodeStatus{Capacity:ResourceList{attachable-volumes-gce-pd: {{127 0} {<nil>} 127 DecimalSI},cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{101203873792 0} {<nil>} 98831908Ki BinarySI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{7815438336 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Allocatable:ResourceList{attachable-volumes-gce-pd: {{127 0} {<nil>} 127 
DecimalSI},cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{91083486262 0} {<nil>} 91083486262 DecimalSI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{7553294336 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Phase:,Conditions:[]NodeCondition{NodeCondition{Type:FrequentDockerRestart,Status:False,LastHeartbeatTime:2022-11-26 18:50:20 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentDockerRestart,Message:docker is functioning properly,},NodeCondition{Type:FrequentContainerdRestart,Status:False,LastHeartbeatTime:2022-11-26 18:50:20 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentContainerdRestart,Message:containerd is functioning properly,},NodeCondition{Type:CorruptDockerOverlay2,Status:False,LastHeartbeatTime:2022-11-26 18:50:20 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoCorruptDockerOverlay2,Message:docker overlay2 is functioning properly,},NodeCondition{Type:KernelDeadlock,Status:False,LastHeartbeatTime:2022-11-26 18:50:20 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:KernelHasNoDeadlock,Message:kernel has no deadlock,},NodeCondition{Type:ReadonlyFilesystem,Status:False,LastHeartbeatTime:2022-11-26 18:50:20 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:FilesystemIsNotReadOnly,Message:Filesystem is not read-only,},NodeCondition{Type:FrequentUnregisterNetDevice,Status:False,LastHeartbeatTime:2022-11-26 18:50:20 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentUnregisterNetDevice,Message:node is functioning properly,},NodeCondition{Type:FrequentKubeletRestart,Status:False,LastHeartbeatTime:2022-11-26 18:50:20 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentKubeletRestart,Message:kubelet is functioning properly,},NodeCondition{Type:NetworkUnavailable,Status:False,LastHeartbeatTime:2022-11-26 18:39:46 +0000 UTC,LastTransitionTime:2022-11-26 18:39:46 +0000 UTC,Reason:RouteCreated,Message:RouteController created a route,},NodeCondition{Type:MemoryPressure,Status:False,LastHeartbeatTime:2022-11-26 18:52:09 +0000 UTC,LastTransitionTime:2022-11-26 18:39:32 +0000 UTC,Reason:KubeletHasSufficientMemory,Message:kubelet has sufficient memory available,},NodeCondition{Type:DiskPressure,Status:False,LastHeartbeatTime:2022-11-26 18:52:09 +0000 UTC,LastTransitionTime:2022-11-26 18:39:32 +0000 UTC,Reason:KubeletHasNoDiskPressure,Message:kubelet has no disk pressure,},NodeCondition{Type:PIDPressure,Status:False,LastHeartbeatTime:2022-11-26 18:52:09 +0000 UTC,LastTransitionTime:2022-11-26 18:39:32 +0000 UTC,Reason:KubeletHasSufficientPID,Message:kubelet has sufficient PID available,},NodeCondition{Type:Ready,Status:True,LastHeartbeatTime:2022-11-26 18:52:09 +0000 UTC,LastTransitionTime:2022-11-26 18:39:32 +0000 UTC,Reason:KubeletReady,Message:kubelet is posting ready status. 
AppArmor enabled,},},Addresses:[]NodeAddress{NodeAddress{Type:InternalIP,Address:10.138.0.3,},NodeAddress{Type:ExternalIP,Address:34.168.33.242,},NodeAddress{Type:InternalDNS,Address:bootstrap-e2e-minion-group-p1wq.c.k8s-jkns-e2e-gke-ubuntu-slow.internal,},NodeAddress{Type:Hostname,Address:bootstrap-e2e-minion-group-p1wq.c.k8s-jkns-e2e-gke-ubuntu-slow.internal,},},DaemonEndpoints:NodeDaemonEndpoints{KubeletEndpoint:DaemonEndpoint{Port:10250,},},NodeInfo:NodeSystemInfo{MachineID:af52744010a8d3f305ac13b550d67291,SystemUUID:af527440-10a8-d3f3-05ac-13b550d67291,BootID:aec04ea1-f168-4b94-9143-5f1e29a8fc63,KernelVersion:5.10.123+,OSImage:Container-Optimized OS from Google,ContainerRuntimeVersion:containerd://1.7.0-beta.0-149-gd06318622,KubeletVersion:v1.27.0-alpha.0.50+70617042976dc1,KubeProxyVersion:v1.27.0-alpha.0.50+70617042976dc1,OperatingSystem:linux,Architecture:amd64,},Images:[]ContainerImage{ContainerImage{Names:[registry.k8s.io/kube-proxy-amd64:v1.27.0-alpha.0.50_70617042976dc1],SizeBytes:67201736,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/agnhost@sha256:16bbf38c463a4223d8cfe4da12bc61010b082a79b4bb003e2d3ba3ece5dd5f9e registry.k8s.io/e2e-test-images/agnhost:2.43],SizeBytes:51706353,},ContainerImage{Names:[gke.gcr.io/prometheus-to-sd@sha256:e739643c3939ba0b161425f45a1989eedfc4a3b166db9a7100863296b4c70510 gke.gcr.io/prometheus-to-sd:v0.11.1-gke.1],SizeBytes:48742566,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-provisioner@sha256:ee3b525d5b89db99da3b8eb521d9cd90cb6e9ef0fbb651e98bb37be78d36b5b8 registry.k8s.io/sig-storage/csi-provisioner:v3.3.0],SizeBytes:25491225,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-resizer@sha256:425d8f1b769398127767b06ed97ce62578a3179bcb99809ce93a1649e025ffe7 registry.k8s.io/sig-storage/csi-resizer:v1.6.0],SizeBytes:24148884,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-snapshotter@sha256:291334908ddf71a4661fd7f6d9d97274de8a5378a2b6fdfeb2ce73414a34f82f registry.k8s.io/sig-storage/csi-snapshotter:v6.1.0],SizeBytes:23881995,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-attacher@sha256:9a685020911e2725ad019dbce6e4a5ab93d51e3d4557f115e64343345e05781b registry.k8s.io/sig-storage/csi-attacher:v4.0.0],SizeBytes:23847201,},ContainerImage{Names:[registry.k8s.io/sig-storage/hostpathplugin@sha256:92257881c1d6493cf18299a24af42330f891166560047902b8d431fb66b01af5 registry.k8s.io/sig-storage/hostpathplugin:v1.9.0],SizeBytes:18758628,},ContainerImage{Names:[registry.k8s.io/coredns/coredns@sha256:8e352a029d304ca7431c6507b56800636c321cb52289686a581ab70aaa8a2e2a registry.k8s.io/coredns/coredns:v1.9.3],SizeBytes:14837849,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-node-driver-registrar@sha256:0103eee7c35e3e0b5cd8cdca9850dc71c793cdeb6669d8be7a89440da2d06ae4 registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.5.1],SizeBytes:9133109,},ContainerImage{Names:[registry.k8s.io/sig-storage/livenessprobe@sha256:933940f13b3ea0abc62e656c1aa5c5b47c04b15d71250413a6b821bd0c58b94e registry.k8s.io/sig-storage/livenessprobe:v2.7.0],SizeBytes:8688564,},ContainerImage{Names:[registry.k8s.io/kas-network-proxy/proxy-agent@sha256:48f2a4ec3e10553a81b8dd1c6fa5fe4bcc9617f78e71c1ca89c6921335e2d7da registry.k8s.io/kas-network-proxy/proxy-agent:v0.0.33],SizeBytes:8512162,},ContainerImage{Names:[registry.k8s.io/metadata-proxy@sha256:e914645f22e946bce5165737e1b244e0a296ad1f0f81a9531adc57af2780978a 
registry.k8s.io/metadata-proxy:v0.1.12],SizeBytes:5301657,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/busybox@sha256:c318242786b139d18676b1c09a0ad7f15fc17f8f16a5b2e625cd0dc8c9703daf registry.k8s.io/e2e-test-images/busybox:1.29-2],SizeBytes:732424,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/busybox@sha256:2e0f836850e09b8b7cc937681d6194537a09fbd5f6b9e08f4d646a85128e8937 registry.k8s.io/e2e-test-images/busybox:1.29-4],SizeBytes:731990,},ContainerImage{Names:[registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097 registry.k8s.io/pause:3.9],SizeBytes:321520,},ContainerImage{Names:[registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d registry.k8s.io/pause:3.8],SizeBytes:311286,},},VolumesInUse:[],VolumesAttached:[]AttachedVolume{},Config:nil,},} Nov 26 18:52:11.145: INFO: Logging kubelet events for node bootstrap-e2e-minion-group-p1wq Nov 26 18:52:11.287: INFO: Logging pods the kubelet thinks is on node bootstrap-e2e-minion-group-p1wq Nov 26 18:52:11.378: INFO: Unable to retrieve kubelet pods for node bootstrap-e2e-minion-group-p1wq: error trying to reach service: No agent available [DeferCleanup (Each)] [sig-cli] Kubectl client tear down framework | framework.go:193 STEP: Destroying namespace "kubectl-8079" for this suite. 11/26/22 18:52:11.378
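The dump above ends with the debug helper failing to read each node's kubelet pod list ("error trying to reach service: No agent available"), which is typically the API server's network proxy reporting that it has no konnectivity agent tunnel to the node. As a minimal sketch (not the e2e framework's exact helper), the same query can be issued by hand through the node proxy subresource with client-go; the kubeconfig path and node name are taken from the log above, everything else is illustrative:

package main

import (
	"context"
	"encoding/json"
	"fmt"

	v1 "k8s.io/api/core/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Kubeconfig path as printed at the top of each spec's log.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/workspace/.kube/config")
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	// GET /api/v1/nodes/<node>/proxy/pods: the API server proxies this to the
	// kubelet's pods endpoint, which is the hop that fails with
	// "No agent available" in the dump above.
	raw, err := client.CoreV1().RESTClient().Get().
		Resource("nodes").
		Name("bootstrap-e2e-minion-group-gnb8").
		SubResource("proxy").
		Suffix("pods").
		DoRaw(context.TODO())
	if err != nil {
		fmt.Println("unable to retrieve kubelet pods:", err)
		return
	}

	// The kubelet answers with a JSON-encoded v1.PodList.
	var pods v1.PodList
	if err := json.Unmarshal(raw, &pods); err != nil {
		panic(err)
	}
	for _, p := range pods.Items {
		fmt.Println(p.Namespace + "/" + p.Name)
	}
}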

go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-cloud\-provider\-gcp\]\sAddon\supdate\sshould\spropagate\sadd\-on\sfile\schanges\s\[Slow\]$'
test/e2e/cloud/gcp/addon_update.go:353 k8s.io/kubernetes/test/e2e/cloud/gcp.waitForReplicationControllerInAddonTest({0x801de88?, 0xc001b06680?}, {0x75ce977?, 0x4?}, {0x760025e?, 0xc0040a17d0?}, 0x1d?) test/e2e/cloud/gcp/addon_update.go:353 +0x54 k8s.io/kubernetes/test/e2e/cloud/gcp.glob..func1.3() test/e2e/cloud/gcp/addon_update.go:311 +0x1025 There were additional failures detected after the initial failure: [FAILED] Nov 26 18:57:21.820: failed to list events in namespace "addon-update-test-1276": Get "https://34.83.88.61/api/v1/namespaces/addon-update-test-1276/events": dial tcp 34.83.88.61:443: connect: connection refused In [DeferCleanup (Each)] at: test/e2e/framework/debug/dump.go:44 ---------- [FAILED] Nov 26 18:57:21.860: Couldn't delete ns: "addon-update-test-1276": Delete "https://34.83.88.61/api/v1/namespaces/addon-update-test-1276": dial tcp 34.83.88.61:443: connect: connection refused (&url.Error{Op:"Delete", URL:"https://34.83.88.61/api/v1/namespaces/addon-update-test-1276", Err:(*net.OpError)(0xc002c29400)}) In [DeferCleanup (Each)] at: test/e2e/framework/framework.go:370
[BeforeEach] [sig-cloud-provider-gcp] Addon update set up framework | framework.go:178 STEP: Creating a kubernetes client 11/26/22 18:52:19.027 Nov 26 18:52:19.027: INFO: >>> kubeConfig: /workspace/.kube/config STEP: Building a namespace api object, basename addon-update-test 11/26/22 18:52:19.029 STEP: Waiting for a default service account to be provisioned in namespace 11/26/22 18:52:19.273 STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 11/26/22 18:52:19.373 [BeforeEach] [sig-cloud-provider-gcp] Addon update test/e2e/framework/metrics/init/init.go:31 [BeforeEach] [sig-cloud-provider-gcp] Addon update test/e2e/cloud/gcp/addon_update.go:223 [It] should propagate add-on file changes [Slow] test/e2e/cloud/gcp/addon_update.go:244 Nov 26 18:52:19.830: INFO: Executing 'mkdir -p addon-test-dir/addon-update-test-1276' on 34.83.88.61:22 Nov 26 18:52:20.000: INFO: Writing remote file 'addon-test-dir/addon-update-test-1276/addon-reconcile-controller.yaml' on 34.83.88.61:22 Nov 26 18:52:20.118: INFO: Writing remote file 'addon-test-dir/addon-update-test-1276/addon-reconcile-controller-Updated.yaml' on 34.83.88.61:22 Nov 26 18:52:20.236: INFO: Writing remote file 'addon-test-dir/addon-update-test-1276/addon-deprecated-label-service.yaml' on 34.83.88.61:22 Nov 26 18:52:20.354: INFO: Writing remote file 'addon-test-dir/addon-update-test-1276/addon-deprecated-label-service-updated.yaml' on 34.83.88.61:22 Nov 26 18:52:20.472: INFO: Writing remote file 'addon-test-dir/addon-update-test-1276/addon-ensure-exists-service.yaml' on 34.83.88.61:22 Nov 26 18:52:20.590: INFO: Writing remote file 'addon-test-dir/addon-update-test-1276/addon-ensure-exists-service-updated.yaml' on 34.83.88.61:22 Nov 26 18:52:20.707: INFO: Writing remote file 'addon-test-dir/addon-update-test-1276/invalid-addon-controller.yaml' on 34.83.88.61:22 Nov 26 18:52:20.825: INFO: Executing 'sudo rm -rf /etc/kubernetes/addons/addon-test-dir' on 34.83.88.61:22 Nov 26 18:52:20.925: INFO: Executing 'sudo mkdir -p /etc/kubernetes/addons/addon-test-dir/addon-update-test-1276' on 34.83.88.61:22 STEP: copy invalid manifests to the destination dir 11/26/22 18:52:21.013 Nov 26 18:52:21.013: INFO: Executing 'sudo cp addon-test-dir/addon-update-test-1276/invalid-addon-controller.yaml /etc/kubernetes/addons/addon-test-dir/addon-update-test-1276/invalid-addon-controller.yaml' on 34.83.88.61:22 STEP: copy new manifests 11/26/22 18:52:21.105 Nov 26 18:52:21.105: INFO: Executing 'sudo cp addon-test-dir/addon-update-test-1276/addon-reconcile-controller.yaml /etc/kubernetes/addons/addon-test-dir/addon-update-test-1276/addon-reconcile-controller.yaml' on 34.83.88.61:22 Nov 26 18:52:21.197: INFO: Executing 'sudo cp addon-test-dir/addon-update-test-1276/addon-deprecated-label-service.yaml /etc/kubernetes/addons/addon-test-dir/addon-update-test-1276/addon-deprecated-label-service.yaml' on 34.83.88.61:22 Nov 26 18:52:21.286: INFO: Executing 'sudo cp addon-test-dir/addon-update-test-1276/addon-ensure-exists-service.yaml /etc/kubernetes/addons/addon-test-dir/addon-update-test-1276/addon-ensure-exists-service.yaml' on 34.83.88.61:22 Nov 26 18:52:21.444: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (replicationcontrollers "addon-reconcile-test" not found). Nov 26 18:52:24.534: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (replicationcontrollers "addon-reconcile-test" not found). 
Nov 26 18:52:27.495: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (replicationcontrollers "addon-reconcile-test" not found). Nov 26 18:52:30.513: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (replicationcontrollers "addon-reconcile-test" not found). Nov 26 18:52:33.492: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (replicationcontrollers "addon-reconcile-test" not found). Nov 26 18:52:36.534: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (replicationcontrollers "addon-reconcile-test" not found). Nov 26 18:52:39.507: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (replicationcontrollers "addon-reconcile-test" not found). Nov 26 18:52:42.528: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (replicationcontrollers "addon-reconcile-test" not found). Nov 26 18:52:45.512: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (replicationcontrollers "addon-reconcile-test" not found). Nov 26 18:52:48.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:52:51.483: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:52:54.483: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:52:57.483: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:53:00.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:53:03.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:53:06.483: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:53:09.483: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:53:12.485: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). 
Nov 26 18:53:15.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:53:18.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:53:21.487: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:53:24.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:53:27.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:53:30.483: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:53:33.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:53:36.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:53:39.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:53:42.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:53:45.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:53:48.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:53:51.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). 
Nov 26 18:53:54.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:53:57.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:54:00.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:54:03.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:54:06.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:54:09.483: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:54:12.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:54:15.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:54:18.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:54:21.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:54:24.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:54:27.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:54:30.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). 
Nov 26 18:54:33.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:54:36.485: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:54:39.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:54:42.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:54:49.040: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused - error from a previous attempt: read tcp 10.60.55.171:41400->34.83.88.61:443: read: connection reset by peer). Nov 26 18:54:51.483: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:54:54.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:54:57.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:55:00.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:55:03.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:55:06.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:55:09.483: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:55:12.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). 
Nov 26 18:55:15.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:55:18.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:55:21.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:55:24.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:55:27.485: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:55:30.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:55:33.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:55:36.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:55:39.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:55:42.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:55:45.483: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:55:48.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:55:51.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). 
Nov 26 18:55:54.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:55:57.485: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:56:00.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:56:03.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:56:06.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:56:09.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:56:12.485: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:56:15.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:56:18.483: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:56:21.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:56:24.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:56:27.488: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:56:30.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). 
Nov 26 18:56:33.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:56:36.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:56:39.483: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:56:42.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:56:45.493: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:56:48.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:56:51.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:56:54.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:56:57.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:57:00.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:57:03.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:57:06.483: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:57:09.483: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). 
Nov 26 18:57:12.484: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:57:15.483: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:57:18.483: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). ------------------------------ Progress Report for Ginkgo Process #3 Automatically polling progress: [sig-cloud-provider-gcp] Addon update should propagate add-on file changes [Slow] (Spec Runtime: 5m0.803s) test/e2e/cloud/gcp/addon_update.go:244 In [It] (Node Runtime: 5m0.001s) test/e2e/cloud/gcp/addon_update.go:244 At [By Step] copy new manifests (Step Runtime: 4m58.726s) test/e2e/cloud/gcp/addon_update.go:300 Spec Goroutine goroutine 2189 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0000820c8}, 0xc001a34168, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0000820c8}, 0x30?, 0x2fd9d05?, 0x30?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0000820c8}, 0x2bc362c?, 0xc0018eb680?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0xc00486edc0?, 0x66e0100?, 0xacfb400?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 > k8s.io/kubernetes/test/e2e/cloud/gcp.waitForReplicationController({0x801de88?, 0xc001b06680}, {0x75ce977, 0xb}, {0x760025e, 0x14}, 0x1, 0xc004802900?, 0x0?) test/e2e/cloud/gcp/addon_update.go:367 > k8s.io/kubernetes/test/e2e/cloud/gcp.waitForReplicationControllerInAddonTest({0x801de88?, 0xc001b06680?}, {0x75ce977?, 0x4?}, {0x760025e?, 0xc0040a17d0?}, 0x1d?) test/e2e/cloud/gcp/addon_update.go:353 > k8s.io/kubernetes/test/e2e/cloud/gcp.glob..func1.3() test/e2e/cloud/gcp/addon_update.go:311 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc002b2cfa0, 0x2fd9502}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:57:21.489: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). Nov 26 18:57:21.528: INFO: Get ReplicationController addon-reconcile-test in namespace kube-system failed (Get "https://34.83.88.61/api/v1/namespaces/kube-system/replicationcontrollers/addon-reconcile-test": dial tcp 34.83.88.61:443: connect: connection refused). 
Nov 26 18:57:21.528: INFO: Unexpected error: <*errors.errorString | 0xc004dac430>: { s: "error waiting for ReplicationController kube-system/addon-reconcile-test to appear: timed out waiting for the condition", } Nov 26 18:57:21.528: FAIL: error waiting for ReplicationController kube-system/addon-reconcile-test to appear: timed out waiting for the condition Full Stack Trace k8s.io/kubernetes/test/e2e/cloud/gcp.waitForReplicationControllerInAddonTest({0x801de88?, 0xc001b06680?}, {0x75ce977?, 0x4?}, {0x760025e?, 0xc0040a17d0?}, 0x1d?) test/e2e/cloud/gcp/addon_update.go:353 +0x54 k8s.io/kubernetes/test/e2e/cloud/gcp.glob..func1.3() test/e2e/cloud/gcp/addon_update.go:311 +0x1025 Nov 26 18:57:21.529: INFO: Cleaning up ensure exist class addon. Nov 26 18:57:21.568: INFO: Unexpected error: <*url.Error | 0xc003045a40>: { Op: "Delete", URL: "https://34.83.88.61/api/v1/namespaces/kube-system/services/addon-ensure-exists-test", Err: <*net.OpError | 0xc0039199a0>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc003696a80>{ IP: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 255, 255, 34, 83, 88, 61], Port: 443, Zone: "", }, Err: <*os.SyscallError | 0xc0016f8a40>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, } Nov 26 18:57:21.569: FAIL: Delete "https://34.83.88.61/api/v1/namespaces/kube-system/services/addon-ensure-exists-test": dial tcp 34.83.88.61:443: connect: connection refused Full Stack Trace k8s.io/kubernetes/test/e2e/cloud/gcp.glob..func1.3.1() test/e2e/cloud/gcp/addon_update.go:308 +0xe5 panic({0x70eb7e0, 0xc0001e4af0}) /usr/local/go/src/runtime/panic.go:884 +0x212 k8s.io/kubernetes/test/e2e/framework.Fail({0xc0012be080, 0x77}, {0xc0005dd7a8?, 0xc0012be080?, 0xc0005dd7d0?}) test/e2e/framework/log.go:61 +0x145 k8s.io/kubernetes/test/e2e/framework.ExpectNoErrorWithOffset(0x1, {0x7fa3ee0, 0xc004dac430}, {0x0?, 0x760025e?, 0x14?}) test/e2e/framework/expect.go:76 +0x267 k8s.io/kubernetes/test/e2e/framework.ExpectNoError(...) test/e2e/framework/expect.go:43 k8s.io/kubernetes/test/e2e/cloud/gcp.waitForReplicationControllerInAddonTest({0x801de88?, 0xc001b06680?}, {0x75ce977?, 0x4?}, {0x760025e?, 0xc0040a17d0?}, 0x1d?) test/e2e/cloud/gcp/addon_update.go:353 +0x54 k8s.io/kubernetes/test/e2e/cloud/gcp.glob..func1.3() test/e2e/cloud/gcp/addon_update.go:311 +0x1025 Nov 26 18:57:21.569: INFO: Executing 'sudo rm -rf /etc/kubernetes/addons/addon-test-dir' on 34.83.88.61:22 Nov 26 18:57:21.658: INFO: Executing 'rm -rf addon-test-dir' on 34.83.88.61:22 [AfterEach] [sig-cloud-provider-gcp] Addon update test/e2e/framework/node/init/init.go:32 Nov 26 18:57:21.741: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready [AfterEach] [sig-cloud-provider-gcp] Addon update test/e2e/cloud/gcp/addon_update.go:237 [DeferCleanup (Each)] [sig-cloud-provider-gcp] Addon update test/e2e/framework/metrics/init/init.go:33 [DeferCleanup (Each)] [sig-cloud-provider-gcp] Addon update dump namespaces | framework.go:196 STEP: dump namespace information after failure 11/26/22 18:57:21.781 STEP: Collecting events from namespace "addon-update-test-1276". 
11/26/22 18:57:21.781 Nov 26 18:57:21.820: INFO: Unexpected error: failed to list events in namespace "addon-update-test-1276": <*url.Error | 0xc003892060>: { Op: "Get", URL: "https://34.83.88.61/api/v1/namespaces/addon-update-test-1276/events", Err: <*net.OpError | 0xc003919bd0>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc002e1ff50>{ IP: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 255, 255, 34, 83, 88, 61], Port: 443, Zone: "", }, Err: <*os.SyscallError | 0xc0016f8e80>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, } Nov 26 18:57:21.820: FAIL: failed to list events in namespace "addon-update-test-1276": Get "https://34.83.88.61/api/v1/namespaces/addon-update-test-1276/events": dial tcp 34.83.88.61:443: connect: connection refused Full Stack Trace k8s.io/kubernetes/test/e2e/framework/debug.dumpEventsInNamespace(0xc0005d85c0, {0xc0040a17d0, 0x16}) test/e2e/framework/debug/dump.go:44 +0x191 k8s.io/kubernetes/test/e2e/framework/debug.DumpAllNamespaceInfo({0x801de88, 0xc001b06680}, {0xc0040a17d0, 0x16}) test/e2e/framework/debug/dump.go:62 +0x8d k8s.io/kubernetes/test/e2e/framework/debug/init.init.0.func1.1(0xc0005d8650?, {0xc0040a17d0?, 0x7fa7740?}) test/e2e/framework/debug/init/init.go:34 +0x32 k8s.io/kubernetes/test/e2e/framework.(*Framework).dumpNamespaceInfo.func1() test/e2e/framework/framework.go:274 +0x6d k8s.io/kubernetes/test/e2e/framework.(*Framework).dumpNamespaceInfo(0xc000edfd10) test/e2e/framework/framework.go:271 +0x179 reflect.Value.call({0x6627cc0?, 0xc00133acd0?, 0xae73300?}, {0x75b6e72, 0x4}, {0xae73300, 0x0, 0x3cf0a1a?}) /usr/local/go/src/reflect/value.go:584 +0x8c5 reflect.Value.Call({0x6627cc0?, 0xc00133acd0?, 0x100010000?}, {0xae73300?, 0xc003490350?, 0x789a580?}) /usr/local/go/src/reflect/value.go:368 +0xbc [DeferCleanup (Each)] [sig-cloud-provider-gcp] Addon update tear down framework | framework.go:193 STEP: Destroying namespace "addon-update-test-1276" for this suite. 11/26/22 18:57:21.821 Nov 26 18:57:21.860: FAIL: Couldn't delete ns: "addon-update-test-1276": Delete "https://34.83.88.61/api/v1/namespaces/addon-update-test-1276": dial tcp 34.83.88.61:443: connect: connection refused (&url.Error{Op:"Delete", URL:"https://34.83.88.61/api/v1/namespaces/addon-update-test-1276", Err:(*net.OpError)(0xc002c29400)}) Full Stack Trace k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach.func1() test/e2e/framework/framework.go:370 +0x4fe k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc000edfd10) test/e2e/framework/framework.go:383 +0x1ca reflect.Value.call({0x6627cc0?, 0xc00133ac10?, 0xc004719fb0?}, {0x75b6e72, 0x4}, {0xae73300, 0x0, 0x0?}) /usr/local/go/src/reflect/value.go:584 +0x8c5 reflect.Value.Call({0x6627cc0?, 0xc00133ac10?, 0x0?}, {0xae73300?, 0x5?, 0xc004e95ad0?}) /usr/local/go/src/reflect/value.go:368 +0xbc
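Before the teardown above, the goroutine dump shows the shape of the wait: wait.PollImmediate drives waitForReplicationController (addon_update.go:367), which keeps re-issuing the Get that produces the long run of "connection refused" lines. The following is a minimal standalone sketch of that polling pattern, not a copy of the framework helper; the 3-second interval and 5-minute timeout are assumptions, while the namespace, controller name, and kubeconfig path come from the log:

package main

import (
	"context"
	"fmt"
	"time"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/util/wait"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

// waitForRC polls until the named ReplicationController exists or the timeout
// expires. Transient errors (such as the "connection refused" lines above)
// are logged and retried rather than treated as fatal.
func waitForRC(c kubernetes.Interface, ns, name string, interval, timeout time.Duration) error {
	return wait.PollImmediate(interval, timeout, func() (bool, error) {
		_, err := c.CoreV1().ReplicationControllers(ns).Get(context.TODO(), name, metav1.GetOptions{})
		if err != nil {
			fmt.Printf("Get ReplicationController %s in namespace %s failed (%v).\n", name, ns, err)
			return false, nil // keep polling
		}
		return true, nil
	})
}

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/workspace/.kube/config")
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	// Interval and timeout are illustrative; the e2e helper's values may differ.
	if err := waitForRC(client, "kube-system", "addon-reconcile-test", 3*time.Second, 5*time.Minute); err != nil {
		fmt.Println("error waiting for ReplicationController:", err)
	}
}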
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-network\]\sLoadBalancers\sESIPP\s\[Slow\]\sshould\shandle\supdates\sto\sExternalTrafficPolicy\sfield$'
test/e2e/framework/network/utils.go:834 k8s.io/kubernetes/test/e2e/framework/network.(*NetworkingTestConfig).setup(0xc001358620, 0x3c?) test/e2e/framework/network/utils.go:834 +0x545 k8s.io/kubernetes/test/e2e/framework/network.NewNetworkingTestConfig(0xc0011c4000, {0x0, 0x0, 0x7f8f6d0?}) test/e2e/framework/network/utils.go:131 +0x125 k8s.io/kubernetes/test/e2e/network.glob..func20.7() test/e2e/network/loadbalancer.go:1544 +0x417
[BeforeEach] [sig-network] LoadBalancers ESIPP [Slow] set up framework | framework.go:178 STEP: Creating a kubernetes client 11/26/22 18:43:57.153 Nov 26 18:43:57.154: INFO: >>> kubeConfig: /workspace/.kube/config STEP: Building a namespace api object, basename esipp 11/26/22 18:43:57.155 STEP: Waiting for a default service account to be provisioned in namespace 11/26/22 18:43:57.35 STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 11/26/22 18:43:57.487 [BeforeEach] [sig-network] LoadBalancers ESIPP [Slow] test/e2e/framework/metrics/init/init.go:31 [BeforeEach] [sig-network] LoadBalancers ESIPP [Slow] test/e2e/network/loadbalancer.go:1250 [It] should handle updates to ExternalTrafficPolicy field test/e2e/network/loadbalancer.go:1480 STEP: creating a service esipp-5143/external-local-update with type=LoadBalancer 11/26/22 18:43:57.761 STEP: setting ExternalTrafficPolicy=Local 11/26/22 18:43:57.761 STEP: waiting for loadbalancer for service esipp-5143/external-local-update 11/26/22 18:43:57.896 Nov 26 18:43:57.896: INFO: Waiting up to 15m0s for service "external-local-update" to have a LoadBalancer STEP: creating a pod to be part of the service external-local-update 11/26/22 18:46:40.002 Nov 26 18:46:40.047: INFO: Waiting up to 2m0s for 1 pods to be created Nov 26 18:46:40.088: INFO: Found all 1 pods Nov 26 18:46:40.088: INFO: Waiting up to 2m0s for 1 pods to be running and ready: [external-local-update-nfqlg] Nov 26 18:46:40.088: INFO: Waiting up to 2m0s for pod "external-local-update-nfqlg" in namespace "esipp-5143" to be "running and ready" Nov 26 18:46:40.128: INFO: Pod "external-local-update-nfqlg": Phase="Pending", Reason="", readiness=false. Elapsed: 40.553195ms Nov 26 18:46:40.128: INFO: Error evaluating pod condition running and ready: want pod 'external-local-update-nfqlg' on 'bootstrap-e2e-minion-group-dzls' to be 'Running' but was 'Pending' Nov 26 18:46:42.176: INFO: Pod "external-local-update-nfqlg": Phase="Running", Reason="", readiness=true. Elapsed: 2.087881971s Nov 26 18:46:42.176: INFO: Pod "external-local-update-nfqlg" satisfied condition "running and ready" Nov 26 18:46:42.176: INFO: Wanted all 1 pods to be running and ready. Result: true. Pods: [external-local-update-nfqlg] STEP: waiting for loadbalancer for service esipp-5143/external-local-update 11/26/22 18:46:42.176 Nov 26 18:46:42.176: INFO: Waiting up to 15m0s for service "external-local-update" to have a LoadBalancer STEP: turning ESIPP off 11/26/22 18:46:42.226 STEP: Performing setup for networking test in namespace esipp-5143 11/26/22 18:46:43.44 STEP: creating a selector 11/26/22 18:46:43.44 STEP: Creating the service pods in kubernetes 11/26/22 18:46:43.44 Nov 26 18:46:43.440: INFO: Waiting up to 10m0s for all (but 0) nodes to be schedulable Nov 26 18:46:43.698: INFO: Waiting up to 5m0s for pod "netserver-0" in namespace "esipp-5143" to be "running and ready" Nov 26 18:46:43.746: INFO: Pod "netserver-0": Phase="Pending", Reason="", readiness=false. Elapsed: 47.838236ms Nov 26 18:46:43.746: INFO: The phase of Pod netserver-0 is Pending, waiting for it to be Running (with Ready = true) Nov 26 18:46:45.790: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 2.092370717s Nov 26 18:46:45.790: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:46:49.634: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. 
Elapsed: 5.935861714s Nov 26 18:46:49.634: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:46:49.837: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 6.139420978s Nov 26 18:46:49.837: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:46:51.787: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 8.089564655s Nov 26 18:46:51.787: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:46:53.787: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 10.089014542s Nov 26 18:46:53.787: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:46:55.810: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 12.11213206s Nov 26 18:46:55.810: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:46:57.798: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 14.100534206s Nov 26 18:46:57.798: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:46:59.822: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 16.123867023s Nov 26 18:46:59.822: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:47:01.800: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 18.102411755s Nov 26 18:47:01.800: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:47:03.790: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 20.091762901s Nov 26 18:47:03.790: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:47:05.845: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 22.146907012s Nov 26 18:47:05.845: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:47:07.793: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 24.095121955s Nov 26 18:47:07.793: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:47:09.788: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 26.090188246s Nov 26 18:47:09.788: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:47:11.825: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 28.127501323s Nov 26 18:47:11.825: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:47:13.788: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 30.0902288s Nov 26 18:47:13.788: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:47:15.801: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 32.103453009s Nov 26 18:47:15.801: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:47:17.794: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 34.096122179s Nov 26 18:47:17.794: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:47:19.791: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 36.093645838s Nov 26 18:47:19.791: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:47:21.794: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 38.096441006s Nov 26 18:47:21.794: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:47:23.808: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. 
Elapsed: 40.109855533s Nov 26 18:47:23.808: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:47:25.788: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 42.090535309s Nov 26 18:47:25.788: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:47:27.787: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 44.088976609s Nov 26 18:47:27.787: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:47:29.802: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 46.104215134s Nov 26 18:47:29.802: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:47:31.791: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 48.093313513s Nov 26 18:47:31.791: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:47:33.791: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 50.092891359s Nov 26 18:47:33.791: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:47:35.792: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 52.094290391s Nov 26 18:47:35.792: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:47:37.791: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 54.093636294s Nov 26 18:47:37.791: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:47:39.791: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 56.093023238s Nov 26 18:47:39.791: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:47:41.890: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 58.192466363s Nov 26 18:47:41.890: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:47:43.793: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 1m0.094984677s Nov 26 18:47:43.793: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:47:45.798: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=true. Elapsed: 1m2.09978717s Nov 26 18:47:45.798: INFO: The phase of Pod netserver-0 is Running (Ready = true) Nov 26 18:47:45.798: INFO: Pod "netserver-0" satisfied condition "running and ready" Nov 26 18:47:45.843: INFO: Waiting up to 5m0s for pod "netserver-1" in namespace "esipp-5143" to be "running and ready" Nov 26 18:47:45.914: INFO: Pod "netserver-1": Phase="Running", Reason="", readiness=true. Elapsed: 70.750663ms Nov 26 18:47:45.914: INFO: The phase of Pod netserver-1 is Running (Ready = true) Nov 26 18:47:45.914: INFO: Pod "netserver-1" satisfied condition "running and ready" Nov 26 18:47:45.983: INFO: Waiting up to 5m0s for pod "netserver-2" in namespace "esipp-5143" to be "running and ready" Nov 26 18:47:46.056: INFO: Pod "netserver-2": Phase="Running", Reason="", readiness=false. Elapsed: 72.880905ms Nov 26 18:47:46.056: INFO: The phase of Pod netserver-2 is Running (Ready = false) Nov 26 18:47:48.103: INFO: Pod "netserver-2": Phase="Running", Reason="", readiness=false. Elapsed: 2.120117113s Nov 26 18:47:48.103: INFO: The phase of Pod netserver-2 is Running (Ready = false) Nov 26 18:47:50.100: INFO: Pod "netserver-2": Phase="Running", Reason="", readiness=false. Elapsed: 4.116853375s Nov 26 18:47:50.100: INFO: The phase of Pod netserver-2 is Running (Ready = false) Nov 26 18:47:52.116: INFO: Pod "netserver-2": Phase="Running", Reason="", readiness=false. 
Elapsed: 6.133061172s Nov 26 18:47:52.116: INFO: The phase of Pod netserver-2 is Running (Ready = false) Nov 26 18:47:54.098: INFO: Pod "netserver-2": Phase="Running", Reason="", readiness=false. Elapsed: 8.114838963s Nov 26 18:47:54.098: INFO: The phase of Pod netserver-2 is Running (Ready = false) Nov 26 18:47:56.098: INFO: Pod "netserver-2": Phase="Running", Reason="", readiness=false. Elapsed: 10.114605416s Nov 26 18:47:56.098: INFO: The phase of Pod netserver-2 is Running (Ready = false) Nov 26 18:47:58.098: INFO: Pod "netserver-2": Phase="Running", Reason="", readiness=false. Elapsed: 12.114549959s Nov 26 18:47:58.098: INFO: The phase of Pod netserver-2 is Running (Ready = false) Nov 26 18:48:00.099: INFO: Pod "netserver-2": Phase="Running", Reason="", readiness=false. Elapsed: 14.115766977s Nov 26 18:48:00.099: INFO: The phase of Pod netserver-2 is Running (Ready = false) Nov 26 18:48:02.099: INFO: Pod "netserver-2": Phase="Running", Reason="", readiness=false. Elapsed: 16.116055333s Nov 26 18:48:02.099: INFO: The phase of Pod netserver-2 is Running (Ready = false) Nov 26 18:48:04.099: INFO: Pod "netserver-2": Phase="Running", Reason="", readiness=false. Elapsed: 18.116471598s Nov 26 18:48:04.099: INFO: The phase of Pod netserver-2 is Running (Ready = false) Nov 26 18:48:06.107: INFO: Pod "netserver-2": Phase="Running", Reason="", readiness=false. Elapsed: 20.123578377s Nov 26 18:48:06.107: INFO: The phase of Pod netserver-2 is Running (Ready = false) Nov 26 18:48:08.099: INFO: Pod "netserver-2": Phase="Running", Reason="", readiness=false. Elapsed: 22.116331096s Nov 26 18:48:08.099: INFO: The phase of Pod netserver-2 is Running (Ready = false) Nov 26 18:48:10.099: INFO: Pod "netserver-2": Phase="Running", Reason="", readiness=false. Elapsed: 24.115944209s Nov 26 18:48:10.099: INFO: The phase of Pod netserver-2 is Running (Ready = false) Nov 26 18:48:12.098: INFO: Pod "netserver-2": Phase="Running", Reason="", readiness=false. Elapsed: 26.115465193s Nov 26 18:48:12.098: INFO: The phase of Pod netserver-2 is Running (Ready = false) Nov 26 18:48:14.099: INFO: Pod "netserver-2": Phase="Running", Reason="", readiness=true. Elapsed: 28.115589766s Nov 26 18:48:14.099: INFO: The phase of Pod netserver-2 is Running (Ready = true) Nov 26 18:48:14.099: INFO: Pod "netserver-2" satisfied condition "running and ready" STEP: Creating test pods 11/26/22 18:48:14.14 Nov 26 18:48:14.214: INFO: Waiting up to 5m0s for pod "test-container-pod" in namespace "esipp-5143" to be "running" Nov 26 18:48:14.255: INFO: Pod "test-container-pod": Phase="Pending", Reason="", readiness=false. Elapsed: 41.252008ms Nov 26 18:48:16.318: INFO: Pod "test-container-pod": Phase="Pending", Reason="", readiness=false. Elapsed: 2.104749226s Nov 26 18:48:18.298: INFO: Pod "test-container-pod": Phase="Pending", Reason="", readiness=false. Elapsed: 4.084400662s Nov 26 18:48:20.297: INFO: Pod "test-container-pod": Phase="Pending", Reason="", readiness=false. Elapsed: 6.083729933s Nov 26 18:48:22.297: INFO: Pod "test-container-pod": Phase="Running", Reason="", readiness=true. 
Elapsed: 8.083398725s Nov 26 18:48:22.297: INFO: Pod "test-container-pod" satisfied condition "running" Nov 26 18:48:22.338: INFO: Setting MaxTries for pod polling to 39 for networking test based on endpoint count 3 STEP: Getting node addresses 11/26/22 18:48:22.338 Nov 26 18:48:22.339: INFO: Waiting up to 10m0s for all (but 0) nodes to be schedulable STEP: Creating the service on top of the pods in kubernetes 11/26/22 18:48:22.428 Nov 26 18:48:22.548: INFO: Service node-port-service in namespace esipp-5143 found. Nov 26 18:48:22.688: INFO: Service session-affinity-service in namespace esipp-5143 found. STEP: Waiting for NodePort service to expose endpoint 11/26/22 18:48:22.734 Nov 26 18:48:23.734: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:24.735: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:25.735: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:26.734: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:27.734: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:28.734: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:29.735: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:30.734: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:31.735: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:32.735: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:33.735: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:34.734: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:35.734: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:36.735: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:37.735: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:38.735: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:39.735: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:40.735: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:41.734: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:42.735: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:43.735: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:44.734: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:45.734: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:46.734: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:47.735: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:48.735: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:49.735: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:50.735: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:51.734: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:52.735: INFO: Waiting for amount of service:node-port-service endpoints to be 3 Nov 26 18:48:52.784: INFO: Waiting for amount of service:node-port-service 
endpoints to be 3
Nov 26 18:48:52.828: INFO: Unexpected error: failed to validate endpoints for service node-port-service in namespace: esipp-5143:
    <*errors.errorString | 0xc000113c60>: {
        s: "timed out waiting for the condition",
    }
Nov 26 18:48:52.829: FAIL: failed to validate endpoints for service node-port-service in namespace: esipp-5143: timed out waiting for the condition

Full Stack Trace
k8s.io/kubernetes/test/e2e/framework/network.(*NetworkingTestConfig).setup(0xc001358620, 0x3c?)
	test/e2e/framework/network/utils.go:834 +0x545
k8s.io/kubernetes/test/e2e/framework/network.NewNetworkingTestConfig(0xc0011c4000, {0x0, 0x0, 0x7f8f6d0?})
	test/e2e/framework/network/utils.go:131 +0x125
k8s.io/kubernetes/test/e2e/network.glob..func20.7()
	test/e2e/network/loadbalancer.go:1544 +0x417
Nov 26 18:48:52.938: INFO: Waiting up to 15m0s for service "external-local-update" to have no LoadBalancer
------------------------------
Progress Report for Ginkgo Process #1
Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should handle updates to ExternalTrafficPolicy field (Spec Runtime: 5m0.531s)
  test/e2e/network/loadbalancer.go:1480
  In [It] (Node Runtime: 5m0.002s)
    test/e2e/network/loadbalancer.go:1480
    At [By Step] Waiting for NodePort service to expose endpoint (Step Runtime: 34.951s)
      test/e2e/framework/network/utils.go:832

Spec Goroutine
goroutine 870 [select]
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc0028c9488, 0x2fdb16a?)
	vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x88?, 0x2fd9d05?, 0x48?)
	vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc00407eee8?, 0xc00407eed8?, 0x262a967?)
	vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:460
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.Poll(0x7fff59af84fa?, 0xa?, 0x7fe0bc8?)
	vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:445
k8s.io/kubernetes/test/e2e/framework/providers/gce.(*Provider).EnsureLoadBalancerResourcesDeleted(0xc0002e2358, {0xc000ce5810, 0xe}, {0x77c6ae2, 0x2})
	test/e2e/framework/providers/gce/gce.go:195
k8s.io/kubernetes/test/e2e/framework.EnsureLoadBalancerResourcesDeleted(...)
	test/e2e/framework/util.go:551
k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancerDestroy.func1()
	test/e2e/framework/service/jig.go:602
k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancerDestroy(0xc001f0bdb0, {0xc000ce5810?, 0x0?}, 0x0?, 0x0?)
	test/e2e/framework/service/jig.go:614
k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).ChangeServiceType(0x0?, {0x75c5095?, 0x0?}, 0x0?)
	test/e2e/framework/service/jig.go:186
> k8s.io/kubernetes/test/e2e/network.glob..func20.7.1()
	test/e2e/network/loadbalancer.go:1494
panic({0x70eb7e0, 0xc000677b20})
	/usr/local/go/src/runtime/panic.go:884
k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2.Fail({0xc00008efc0, 0x8d}, {0xc00407f4b0?, 0x75b521a?, 0xc00407f4d0?})
	vendor/github.com/onsi/ginkgo/v2/core_dsl.go:352
k8s.io/kubernetes/test/e2e/framework.Fail({0xc002e6bb80, 0x78}, {0xc00407f548?, 0x76740e9?, 0xc00407f570?})
	test/e2e/framework/log.go:61
k8s.io/kubernetes/test/e2e/framework.ExpectNoErrorWithOffset(0x1, {0x7fa3ee0, 0xc000113c60}, {0xc00405be60?, 0x75ee1b4?, 0x11?})
	test/e2e/framework/expect.go:76
k8s.io/kubernetes/test/e2e/framework.ExpectNoError(...)
test/e2e/framework/expect.go:43 > k8s.io/kubernetes/test/e2e/framework/network.(*NetworkingTestConfig).setup(0xc001358620, 0x3c?) test/e2e/framework/network/utils.go:834 > k8s.io/kubernetes/test/e2e/framework/network.NewNetworkingTestConfig(0xc0011c4000, {0x0, 0x0, 0x7f8f6d0?}) test/e2e/framework/network/utils.go:131 > k8s.io/kubernetes/test/e2e/network.glob..func20.7() test/e2e/network/loadbalancer.go:1544 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0x2d591ce, 0xc002b5e480}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ [AfterEach] [sig-network] LoadBalancers ESIPP [Slow] test/e2e/framework/node/init/init.go:32 Nov 26 18:49:03.193: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready [AfterEach] [sig-network] LoadBalancers ESIPP [Slow] test/e2e/network/loadbalancer.go:1260 Nov 26 18:49:03.243: INFO: Output of kubectl describe svc: Nov 26 18:49:03.243: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=esipp-5143 describe svc --namespace=esipp-5143' Nov 26 18:49:03.877: INFO: stderr: "" Nov 26 18:49:03.877: INFO: stdout: "Name: external-local-update\nNamespace: esipp-5143\nLabels: testid=external-local-update-7b6ad3a1-b2a2-41b4-a01e-b9c219521b9c\nAnnotations: <none>\nSelector: testid=external-local-update-7b6ad3a1-b2a2-41b4-a01e-b9c219521b9c\nType: ClusterIP\nIP Family Policy: SingleStack\nIP Families: IPv4\nIP: 10.0.86.4\nIPs: 10.0.86.4\nPort: <unset> 80/TCP\nTargetPort: 80/TCP\nEndpoints: 10.64.3.82:80\nSession Affinity: None\nEvents:\n Type Reason Age From Message\n ---- ------ ---- ---- -------\n Normal EnsuringLoadBalancer 3m service-controller Ensuring load balancer\n Normal EnsuredLoadBalancer 2m25s service-controller Ensured load balancer\n Normal ExternalTrafficPolicy 2m21s service-controller Local -> Cluster\n Normal Type 11s service-controller LoadBalancer -> ClusterIP\n Normal DeletingLoadBalancer 5s service-controller Deleting load balancer\n\n\nName: node-port-service\nNamespace: esipp-5143\nLabels: <none>\nAnnotations: <none>\nSelector: selector-360f354e-78fd-4b78-86c3-ba7795446c8e=true\nType: NodePort\nIP Family Policy: SingleStack\nIP Families: IPv4\nIP: 10.0.124.101\nIPs: 10.0.124.101\nPort: http 80/TCP\nTargetPort: 8083/TCP\nNodePort: http 32707/TCP\nEndpoints: 10.64.2.81:8083,10.64.3.93:8083\nPort: udp 90/UDP\nTargetPort: 8081/UDP\nNodePort: udp 32104/UDP\nEndpoints: 10.64.2.81:8081,10.64.3.93:8081\nSession Affinity: None\nExternal Traffic Policy: Cluster\nEvents: <none>\n\n\nName: session-affinity-service\nNamespace: esipp-5143\nLabels: <none>\nAnnotations: <none>\nSelector: selector-360f354e-78fd-4b78-86c3-ba7795446c8e=true\nType: NodePort\nIP Family Policy: SingleStack\nIP Families: IPv4\nIP: 10.0.88.58\nIPs: 10.0.88.58\nPort: http 80/TCP\nTargetPort: 8083/TCP\nNodePort: http 32504/TCP\nEndpoints: 10.64.2.81:8083,10.64.3.93:8083\nPort: udp 90/UDP\nTargetPort: 8081/UDP\nNodePort: udp 30123/UDP\nEndpoints: 10.64.2.81:8081,10.64.3.93:8081\nSession Affinity: ClientIP\nExternal Traffic Policy: Cluster\nEvents: <none>\n" Nov 26 18:49:03.877: INFO: Name: external-local-update Namespace: 
esipp-5143 Labels: testid=external-local-update-7b6ad3a1-b2a2-41b4-a01e-b9c219521b9c Annotations: <none> Selector: testid=external-local-update-7b6ad3a1-b2a2-41b4-a01e-b9c219521b9c Type: ClusterIP IP Family Policy: SingleStack IP Families: IPv4 IP: 10.0.86.4 IPs: 10.0.86.4 Port: <unset> 80/TCP TargetPort: 80/TCP Endpoints: 10.64.3.82:80 Session Affinity: None Events: Type Reason Age From Message ---- ------ ---- ---- ------- Normal EnsuringLoadBalancer 3m service-controller Ensuring load balancer Normal EnsuredLoadBalancer 2m25s service-controller Ensured load balancer Normal ExternalTrafficPolicy 2m21s service-controller Local -> Cluster Normal Type 11s service-controller LoadBalancer -> ClusterIP Normal DeletingLoadBalancer 5s service-controller Deleting load balancer Name: node-port-service Namespace: esipp-5143 Labels: <none> Annotations: <none> Selector: selector-360f354e-78fd-4b78-86c3-ba7795446c8e=true Type: NodePort IP Family Policy: SingleStack IP Families: IPv4 IP: 10.0.124.101 IPs: 10.0.124.101 Port: http 80/TCP TargetPort: 8083/TCP NodePort: http 32707/TCP Endpoints: 10.64.2.81:8083,10.64.3.93:8083 Port: udp 90/UDP TargetPort: 8081/UDP NodePort: udp 32104/UDP Endpoints: 10.64.2.81:8081,10.64.3.93:8081 Session Affinity: None External Traffic Policy: Cluster Events: <none> Name: session-affinity-service Namespace: esipp-5143 Labels: <none> Annotations: <none> Selector: selector-360f354e-78fd-4b78-86c3-ba7795446c8e=true Type: NodePort IP Family Policy: SingleStack IP Families: IPv4 IP: 10.0.88.58 IPs: 10.0.88.58 Port: http 80/TCP TargetPort: 8083/TCP NodePort: http 32504/TCP Endpoints: 10.64.2.81:8083,10.64.3.93:8083 Port: udp 90/UDP TargetPort: 8081/UDP NodePort: udp 30123/UDP Endpoints: 10.64.2.81:8081,10.64.3.93:8081 Session Affinity: ClientIP External Traffic Policy: Cluster Events: <none> [DeferCleanup (Each)] [sig-network] LoadBalancers ESIPP [Slow] test/e2e/framework/metrics/init/init.go:33 [DeferCleanup (Each)] [sig-network] LoadBalancers ESIPP [Slow] dump namespaces | framework.go:196 STEP: dump namespace information after failure 11/26/22 18:49:03.877 STEP: Collecting events from namespace "esipp-5143". 11/26/22 18:49:03.877 STEP: Found 37 events. 
11/26/22 18:49:03.93 Nov 26 18:49:03.931: INFO: At 0001-01-01 00:00:00 +0000 UTC - event for test-container-pod: { } Scheduled: Successfully assigned esipp-5143/test-container-pod to bootstrap-e2e-minion-group-dzls Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:03 +0000 UTC - event for external-local-update: {service-controller } EnsuringLoadBalancer: Ensuring load balancer Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:38 +0000 UTC - event for external-local-update: {service-controller } EnsuredLoadBalancer: Ensured load balancer Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:40 +0000 UTC - event for external-local-update: {replication-controller } SuccessfulCreate: Created pod: external-local-update-nfqlg Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:40 +0000 UTC - event for external-local-update-nfqlg: {kubelet bootstrap-e2e-minion-group-dzls} Pulled: Container image "registry.k8s.io/e2e-test-images/agnhost:2.43" already present on machine Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:40 +0000 UTC - event for external-local-update-nfqlg: {kubelet bootstrap-e2e-minion-group-dzls} Created: Created container netexec Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:40 +0000 UTC - event for external-local-update-nfqlg: {kubelet bootstrap-e2e-minion-group-dzls} Started: Started container netexec Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:40 +0000 UTC - event for external-local-update-nfqlg: {default-scheduler } Scheduled: Successfully assigned esipp-5143/external-local-update-nfqlg to bootstrap-e2e-minion-group-dzls Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:42 +0000 UTC - event for external-local-update: {service-controller } ExternalTrafficPolicy: Local -> Cluster Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:43 +0000 UTC - event for netserver-0: {default-scheduler } Scheduled: Successfully assigned esipp-5143/netserver-0 to bootstrap-e2e-minion-group-dzls Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:43 +0000 UTC - event for netserver-1: {default-scheduler } Scheduled: Successfully assigned esipp-5143/netserver-1 to bootstrap-e2e-minion-group-gnb8 Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:43 +0000 UTC - event for netserver-2: {default-scheduler } Scheduled: Successfully assigned esipp-5143/netserver-2 to bootstrap-e2e-minion-group-p1wq Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:44 +0000 UTC - event for netserver-0: {kubelet bootstrap-e2e-minion-group-dzls} Created: Created container webserver Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:44 +0000 UTC - event for netserver-0: {kubelet bootstrap-e2e-minion-group-dzls} Started: Started container webserver Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:44 +0000 UTC - event for netserver-0: {kubelet bootstrap-e2e-minion-group-dzls} Pulled: Container image "registry.k8s.io/e2e-test-images/agnhost:2.43" already present on machine Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:44 +0000 UTC - event for netserver-1: {kubelet bootstrap-e2e-minion-group-gnb8} Pulled: Container image "registry.k8s.io/e2e-test-images/agnhost:2.43" already present on machine Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:44 +0000 UTC - event for netserver-1: {kubelet bootstrap-e2e-minion-group-gnb8} Created: Created container webserver Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:44 +0000 UTC - event for netserver-1: {kubelet bootstrap-e2e-minion-group-gnb8} Started: Started container webserver Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:44 +0000 UTC - event for netserver-2: {kubelet bootstrap-e2e-minion-group-p1wq} Pulled: Container image 
"registry.k8s.io/e2e-test-images/agnhost:2.43" already present on machine Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:44 +0000 UTC - event for netserver-2: {kubelet bootstrap-e2e-minion-group-p1wq} Started: Started container webserver Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:44 +0000 UTC - event for netserver-2: {kubelet bootstrap-e2e-minion-group-p1wq} Created: Created container webserver Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:45 +0000 UTC - event for netserver-0: {kubelet bootstrap-e2e-minion-group-dzls} Killing: Stopping container webserver Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:46 +0000 UTC - event for netserver-0: {kubelet bootstrap-e2e-minion-group-dzls} SandboxChanged: Pod sandbox changed, it will be killed and re-created. Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:50 +0000 UTC - event for netserver-0: {kubelet bootstrap-e2e-minion-group-dzls} BackOff: Back-off restarting failed container webserver in pod netserver-0_esipp-5143(d9e679d0-2018-4d5e-9dae-7b0f8c64761e) Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:54 +0000 UTC - event for netserver-2: {kubelet bootstrap-e2e-minion-group-p1wq} Killing: Stopping container webserver Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:55 +0000 UTC - event for netserver-2: {kubelet bootstrap-e2e-minion-group-p1wq} SandboxChanged: Pod sandbox changed, it will be killed and re-created. Nov 26 18:49:03.931: INFO: At 2022-11-26 18:46:58 +0000 UTC - event for netserver-2: {kubelet bootstrap-e2e-minion-group-p1wq} BackOff: Back-off restarting failed container webserver in pod netserver-2_esipp-5143(69636f58-7e33-4dfe-82a4-5303d941805f) Nov 26 18:49:03.931: INFO: At 2022-11-26 18:47:03 +0000 UTC - event for netserver-1: {kubelet bootstrap-e2e-minion-group-gnb8} Killing: Stopping container webserver Nov 26 18:49:03.931: INFO: At 2022-11-26 18:47:04 +0000 UTC - event for netserver-1: {kubelet bootstrap-e2e-minion-group-gnb8} SandboxChanged: Pod sandbox changed, it will be killed and re-created. Nov 26 18:49:03.931: INFO: At 2022-11-26 18:48:20 +0000 UTC - event for test-container-pod: {kubelet bootstrap-e2e-minion-group-dzls} Pulled: Container image "registry.k8s.io/e2e-test-images/agnhost:2.43" already present on machine Nov 26 18:49:03.931: INFO: At 2022-11-26 18:48:20 +0000 UTC - event for test-container-pod: {kubelet bootstrap-e2e-minion-group-dzls} Created: Created container webserver Nov 26 18:49:03.931: INFO: At 2022-11-26 18:48:21 +0000 UTC - event for test-container-pod: {kubelet bootstrap-e2e-minion-group-dzls} Started: Started container webserver Nov 26 18:49:03.931: INFO: At 2022-11-26 18:48:22 +0000 UTC - event for test-container-pod: {kubelet bootstrap-e2e-minion-group-dzls} Killing: Stopping container webserver Nov 26 18:49:03.931: INFO: At 2022-11-26 18:48:23 +0000 UTC - event for test-container-pod: {kubelet bootstrap-e2e-minion-group-dzls} SandboxChanged: Pod sandbox changed, it will be killed and re-created. 
Nov 26 18:49:03.931: INFO: At 2022-11-26 18:48:26 +0000 UTC - event for test-container-pod: {kubelet bootstrap-e2e-minion-group-dzls} BackOff: Back-off restarting failed container webserver in pod test-container-pod_esipp-5143(1dd2d2d0-634c-4616-b8e6-570c8e73329d) Nov 26 18:49:03.931: INFO: At 2022-11-26 18:48:52 +0000 UTC - event for external-local-update: {service-controller } Type: LoadBalancer -> ClusterIP Nov 26 18:49:03.931: INFO: At 2022-11-26 18:48:58 +0000 UTC - event for external-local-update: {service-controller } DeletingLoadBalancer: Deleting load balancer Nov 26 18:49:03.979: INFO: POD NODE PHASE GRACE CONDITIONS Nov 26 18:49:03.979: INFO: external-local-update-nfqlg bootstrap-e2e-minion-group-dzls Running [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:46:40 +0000 UTC } {Ready True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:46:41 +0000 UTC } {ContainersReady True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:46:41 +0000 UTC } {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:46:40 +0000 UTC }] Nov 26 18:49:03.979: INFO: netserver-0 bootstrap-e2e-minion-group-dzls Running [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:46:43 +0000 UTC } {Ready True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:47:44 +0000 UTC } {ContainersReady True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:47:44 +0000 UTC } {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:46:43 +0000 UTC }] Nov 26 18:49:03.979: INFO: netserver-1 bootstrap-e2e-minion-group-gnb8 Running [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:46:43 +0000 UTC } {Ready True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:47:23 +0000 UTC } {ContainersReady True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:47:23 +0000 UTC } {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:46:43 +0000 UTC }] Nov 26 18:49:03.979: INFO: netserver-2 bootstrap-e2e-minion-group-p1wq Running [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:46:43 +0000 UTC } {Ready False 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:48:14 +0000 UTC ContainersNotReady containers with unready status: [webserver]} {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:48:14 +0000 UTC ContainersNotReady containers with unready status: [webserver]} {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:46:43 +0000 UTC }] Nov 26 18:49:03.979: INFO: test-container-pod bootstrap-e2e-minion-group-dzls Running [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:48:19 +0000 UTC } {Ready True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:48:42 +0000 UTC } {ContainersReady True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:48:42 +0000 UTC } {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:48:19 +0000 UTC }] Nov 26 18:49:03.979: INFO: Nov 26 18:49:04.319: INFO: Logging node info for node bootstrap-e2e-master Nov 26 18:49:04.377: INFO: Node Info: &Node{ObjectMeta:{bootstrap-e2e-master 3b91a491-10ed-470d-8d9a-7e47529f6987 4734 0 2022-11-26 18:39:29 +0000 UTC <nil> <nil> map[beta.kubernetes.io/arch:amd64 beta.kubernetes.io/instance-type:n1-standard-1 beta.kubernetes.io/os:linux cloud.google.com/metadata-proxy-ready:true failure-domain.beta.kubernetes.io/region:us-west1 failure-domain.beta.kubernetes.io/zone:us-west1-b kubernetes.io/arch:amd64 kubernetes.io/hostname:bootstrap-e2e-master kubernetes.io/os:linux node.kubernetes.io/instance-type:n1-standard-1 topology.kubernetes.io/region:us-west1 topology.kubernetes.io/zone:us-west1-b] map[node.alpha.kubernetes.io/ttl:0 
volumes.kubernetes.io/controller-managed-attach-detach:true] [] [] [{kubelet Update v1 2022-11-26 18:39:29 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-attach-detach":{}},"f:labels":{".":{},"f:beta.kubernetes.io/arch":{},"f:beta.kubernetes.io/instance-type":{},"f:beta.kubernetes.io/os":{},"f:cloud.google.com/metadata-proxy-ready":{},"f:failure-domain.beta.kubernetes.io/region":{},"f:failure-domain.beta.kubernetes.io/zone":{},"f:kubernetes.io/arch":{},"f:kubernetes.io/hostname":{},"f:kubernetes.io/os":{},"f:node.kubernetes.io/instance-type":{},"f:topology.kubernetes.io/region":{},"f:topology.kubernetes.io/zone":{}}},"f:spec":{"f:providerID":{},"f:unschedulable":{}}} } {kube-controller-manager Update v1 2022-11-26 18:39:46 +0000 UTC FieldsV1 {"f:status":{"f:conditions":{"k:{\"type\":\"NetworkUnavailable\"}":{"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{}}}}} status} {kube-controller-manager Update v1 2022-11-26 18:39:49 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}},"f:spec":{"f:podCIDR":{},"f:podCIDRs":{".":{},"v:\"10.64.0.0/24\"":{}},"f:taints":{}}} } {kubelet Update v1 2022-11-26 18:45:48 +0000 UTC FieldsV1 {"f:status":{"f:conditions":{"k:{\"type\":\"DiskPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"MemoryPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"PIDPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"Ready\"}":{"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{}}},"f:images":{}}} status}]},Spec:NodeSpec{PodCIDR:10.64.0.0/24,DoNotUseExternalID:,ProviderID:gce://k8s-jkns-e2e-gke-ubuntu-slow/us-west1-b/bootstrap-e2e-master,Unschedulable:true,Taints:[]Taint{Taint{Key:node-role.kubernetes.io/master,Value:,Effect:NoSchedule,TimeAdded:<nil>,},Taint{Key:node.kubernetes.io/unschedulable,Value:,Effect:NoSchedule,TimeAdded:<nil>,},},ConfigSource:nil,PodCIDRs:[10.64.0.0/24],},Status:NodeStatus{Capacity:ResourceList{attachable-volumes-gce-pd: {{127 0} {<nil>} 127 DecimalSI},cpu: {{1 0} {<nil>} 1 DecimalSI},ephemeral-storage: {{16656896000 0} {<nil>} 16266500Ki BinarySI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{3858374656 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Allocatable:ResourceList{attachable-volumes-gce-pd: {{127 0} {<nil>} 127 DecimalSI},cpu: {{1 0} {<nil>} 1 DecimalSI},ephemeral-storage: {{14991206376 0} {<nil>} 14991206376 DecimalSI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{3596230656 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Phase:,Conditions:[]NodeCondition{NodeCondition{Type:NetworkUnavailable,Status:False,LastHeartbeatTime:2022-11-26 18:39:46 +0000 UTC,LastTransitionTime:2022-11-26 18:39:46 +0000 UTC,Reason:RouteCreated,Message:RouteController created a route,},NodeCondition{Type:MemoryPressure,Status:False,LastHeartbeatTime:2022-11-26 18:45:48 +0000 UTC,LastTransitionTime:2022-11-26 18:39:29 +0000 UTC,Reason:KubeletHasSufficientMemory,Message:kubelet has sufficient memory available,},NodeCondition{Type:DiskPressure,Status:False,LastHeartbeatTime:2022-11-26 18:45:48 +0000 UTC,LastTransitionTime:2022-11-26 18:39:29 +0000 UTC,Reason:KubeletHasNoDiskPressure,Message:kubelet has no disk pressure,},NodeCondition{Type:PIDPressure,Status:False,LastHeartbeatTime:2022-11-26 18:45:48 +0000 UTC,LastTransitionTime:2022-11-26 18:39:29 +0000 
UTC,Reason:KubeletHasSufficientPID,Message:kubelet has sufficient PID available,},NodeCondition{Type:Ready,Status:True,LastHeartbeatTime:2022-11-26 18:45:48 +0000 UTC,LastTransitionTime:2022-11-26 18:39:48 +0000 UTC,Reason:KubeletReady,Message:kubelet is posting ready status. AppArmor enabled,},},Addresses:[]NodeAddress{NodeAddress{Type:InternalIP,Address:10.138.0.2,},NodeAddress{Type:ExternalIP,Address:34.83.88.61,},NodeAddress{Type:InternalDNS,Address:bootstrap-e2e-master.c.k8s-jkns-e2e-gke-ubuntu-slow.internal,},NodeAddress{Type:Hostname,Address:bootstrap-e2e-master.c.k8s-jkns-e2e-gke-ubuntu-slow.internal,},},DaemonEndpoints:NodeDaemonEndpoints{KubeletEndpoint:DaemonEndpoint{Port:10250,},},NodeInfo:NodeSystemInfo{MachineID:3cf1b58d8a4ce7f42c27588b99e07e59,SystemUUID:3cf1b58d-8a4c-e7f4-2c27-588b99e07e59,BootID:5ae5a72a-a603-4ca4-8f28-2f5024cd1d88,KernelVersion:5.10.123+,OSImage:Container-Optimized OS from Google,ContainerRuntimeVersion:containerd://1.7.0-beta.0-149-gd06318622,KubeletVersion:v1.27.0-alpha.0.50+70617042976dc1,KubeProxyVersion:v1.27.0-alpha.0.50+70617042976dc1,OperatingSystem:linux,Architecture:amd64,},Images:[]ContainerImage{ContainerImage{Names:[registry.k8s.io/kube-apiserver-amd64:v1.27.0-alpha.0.50_70617042976dc1],SizeBytes:135160272,},ContainerImage{Names:[registry.k8s.io/kube-controller-manager-amd64:v1.27.0-alpha.0.50_70617042976dc1],SizeBytes:124990265,},ContainerImage{Names:[registry.k8s.io/etcd@sha256:dd75ec974b0a2a6f6bb47001ba09207976e625db898d1b16735528c009cb171c registry.k8s.io/etcd:3.5.6-0],SizeBytes:102542580,},ContainerImage{Names:[registry.k8s.io/kube-scheduler-amd64:v1.27.0-alpha.0.50_70617042976dc1],SizeBytes:57660216,},ContainerImage{Names:[gke.gcr.io/prometheus-to-sd@sha256:e739643c3939ba0b161425f45a1989eedfc4a3b166db9a7100863296b4c70510 gke.gcr.io/prometheus-to-sd:v0.11.1-gke.1],SizeBytes:48742566,},ContainerImage{Names:[gcr.io/k8s-ingress-image-push/ingress-gce-glbc-amd64@sha256:5db27383add6d9f4ebdf0286409ac31f7f5d273690204b341a4e37998917693b gcr.io/k8s-ingress-image-push/ingress-gce-glbc-amd64:v1.20.1],SizeBytes:36598135,},ContainerImage{Names:[registry.k8s.io/addon-manager/kube-addon-manager@sha256:49cc4e6e4a3745b427ce14b0141476ab339bb65c6bc05033019e046c8727dcb0 registry.k8s.io/addon-manager/kube-addon-manager:v9.1.6],SizeBytes:30464183,},ContainerImage{Names:[registry.k8s.io/kas-network-proxy/proxy-server@sha256:2c111f004bec24888d8cfa2a812a38fb8341350abac67dcd0ac64e709dfe389c registry.k8s.io/kas-network-proxy/proxy-server:v0.0.33],SizeBytes:22020129,},ContainerImage{Names:[registry.k8s.io/metadata-proxy@sha256:e914645f22e946bce5165737e1b244e0a296ad1f0f81a9531adc57af2780978a registry.k8s.io/metadata-proxy:v0.1.12],SizeBytes:5301657,},ContainerImage{Names:[registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d registry.k8s.io/pause:3.8],SizeBytes:311286,},},VolumesInUse:[],VolumesAttached:[]AttachedVolume{},Config:nil,},} Nov 26 18:49:04.378: INFO: Logging kubelet events for node bootstrap-e2e-master Nov 26 18:49:04.429: INFO: Logging pods the kubelet thinks is on node bootstrap-e2e-master Nov 26 18:49:04.492: INFO: etcd-server-bootstrap-e2e-master started at 2022-11-26 18:38:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:04.492: INFO: Container etcd-container ready: true, restart count 1 Nov 26 18:49:04.492: INFO: l7-lb-controller-bootstrap-e2e-master started at 2022-11-26 18:39:03 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:04.492: INFO: Container l7-lb-controller ready: true, restart 
count 5 Nov 26 18:49:04.492: INFO: etcd-server-events-bootstrap-e2e-master started at 2022-11-26 18:38:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:04.492: INFO: Container etcd-container ready: true, restart count 0 Nov 26 18:49:04.492: INFO: kube-apiserver-bootstrap-e2e-master started at 2022-11-26 18:38:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:04.492: INFO: Container kube-apiserver ready: true, restart count 1 Nov 26 18:49:04.492: INFO: kube-controller-manager-bootstrap-e2e-master started at 2022-11-26 18:38:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:04.492: INFO: Container kube-controller-manager ready: true, restart count 4 Nov 26 18:49:04.492: INFO: kube-scheduler-bootstrap-e2e-master started at 2022-11-26 18:38:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:04.492: INFO: Container kube-scheduler ready: true, restart count 2 Nov 26 18:49:04.492: INFO: kube-addon-manager-bootstrap-e2e-master started at 2022-11-26 18:39:03 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:04.492: INFO: Container kube-addon-manager ready: true, restart count 2 Nov 26 18:49:04.492: INFO: metadata-proxy-v0.1-svjxn started at 2022-11-26 18:39:48 +0000 UTC (0+2 container statuses recorded) Nov 26 18:49:04.492: INFO: Container metadata-proxy ready: true, restart count 0 Nov 26 18:49:04.492: INFO: Container prometheus-to-sd-exporter ready: true, restart count 0 Nov 26 18:49:04.492: INFO: konnectivity-server-bootstrap-e2e-master started at 2022-11-26 18:38:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:04.492: INFO: Container konnectivity-server-container ready: true, restart count 3 Nov 26 18:49:04.791: INFO: Latency metrics for node bootstrap-e2e-master Nov 26 18:49:04.791: INFO: Logging node info for node bootstrap-e2e-minion-group-dzls Nov 26 18:49:04.882: INFO: Node Info: &Node{ObjectMeta:{bootstrap-e2e-minion-group-dzls 073571af-ef8e-4d0a-9070-5398dab26550 6419 0 2022-11-26 18:39:33 +0000 UTC <nil> <nil> map[beta.kubernetes.io/arch:amd64 beta.kubernetes.io/instance-type:n1-standard-2 beta.kubernetes.io/os:linux cloud.google.com/metadata-proxy-ready:true failure-domain.beta.kubernetes.io/region:us-west1 failure-domain.beta.kubernetes.io/zone:us-west1-b kubernetes.io/arch:amd64 kubernetes.io/hostname:bootstrap-e2e-minion-group-dzls kubernetes.io/os:linux node.kubernetes.io/instance-type:n1-standard-2 topology.hostpath.csi/node:bootstrap-e2e-minion-group-dzls topology.kubernetes.io/region:us-west1 topology.kubernetes.io/zone:us-west1-b] map[csi.volume.kubernetes.io/nodeid:{"csi-mock-csi-mock-volumes-2237":"bootstrap-e2e-minion-group-dzls","csi-mock-csi-mock-volumes-5240":"bootstrap-e2e-minion-group-dzls"} node.alpha.kubernetes.io/ttl:0 volumes.kubernetes.io/controller-managed-attach-detach:true] [] [] [{kube-controller-manager Update v1 2022-11-26 18:39:33 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}},"f:spec":{"f:podCIDR":{},"f:podCIDRs":{".":{},"v:\"10.64.3.0/24\"":{}}}} } {kubelet Update v1 2022-11-26 18:39:33 +0000 UTC FieldsV1 
{"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-attach-detach":{}},"f:labels":{".":{},"f:beta.kubernetes.io/arch":{},"f:beta.kubernetes.io/instance-type":{},"f:beta.kubernetes.io/os":{},"f:cloud.google.com/metadata-proxy-ready":{},"f:failure-domain.beta.kubernetes.io/region":{},"f:failure-domain.beta.kubernetes.io/zone":{},"f:kubernetes.io/arch":{},"f:kubernetes.io/hostname":{},"f:kubernetes.io/os":{},"f:node.kubernetes.io/instance-type":{},"f:topology.kubernetes.io/region":{},"f:topology.kubernetes.io/zone":{}}},"f:spec":{"f:providerID":{}}} } {kube-controller-manager Update v1 2022-11-26 18:44:32 +0000 UTC FieldsV1 {"f:status":{"f:conditions":{"k:{\"type\":\"NetworkUnavailable\"}":{"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{}}}}} status} {node-problem-detector Update v1 2022-11-26 18:44:38 +0000 UTC FieldsV1 {"f:status":{"f:conditions":{"k:{\"type\":\"CorruptDockerOverlay2\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentContainerdRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentDockerRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentKubeletRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentUnregisterNetDevice\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"KernelDeadlock\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"ReadonlyFilesystem\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}}}}} status} {kubelet Update v1 2022-11-26 18:48:41 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:csi.volume.kubernetes.io/nodeid":{}},"f:labels":{"f:topology.hostpath.csi/node":{}}},"f:status":{"f:conditions":{"k:{\"type\":\"DiskPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"MemoryPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"PIDPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"Ready\"}":{"f:lastHeartbeatTime":{},"f:message":{},"f:reason":{},"f:status":{}}},"f:images":{}}} status}]},Spec:NodeSpec{PodCIDR:10.64.3.0/24,DoNotUseExternalID:,ProviderID:gce://k8s-jkns-e2e-gke-ubuntu-slow/us-west1-b/bootstrap-e2e-minion-group-dzls,Unschedulable:false,Taints:[]Taint{},ConfigSource:nil,PodCIDRs:[10.64.3.0/24],},Status:NodeStatus{Capacity:ResourceList{attachable-volumes-gce-pd: {{127 0} {<nil>} 127 DecimalSI},cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{101203873792 0} {<nil>} 98831908Ki BinarySI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{7815438336 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Allocatable:ResourceList{attachable-volumes-gce-pd: {{127 0} {<nil>} 127 DecimalSI},cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{91083486262 0} {<nil>} 91083486262 DecimalSI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{7553294336 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 
DecimalSI},},Phase:,Conditions:[]NodeCondition{NodeCondition{Type:FrequentContainerdRestart,Status:False,LastHeartbeatTime:2022-11-26 18:44:38 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentContainerdRestart,Message:containerd is functioning properly,},NodeCondition{Type:KernelDeadlock,Status:False,LastHeartbeatTime:2022-11-26 18:44:38 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:KernelHasNoDeadlock,Message:kernel has no deadlock,},NodeCondition{Type:ReadonlyFilesystem,Status:False,LastHeartbeatTime:2022-11-26 18:44:38 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:FilesystemIsNotReadOnly,Message:Filesystem is not read-only,},NodeCondition{Type:CorruptDockerOverlay2,Status:False,LastHeartbeatTime:2022-11-26 18:44:38 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoCorruptDockerOverlay2,Message:docker overlay2 is functioning properly,},NodeCondition{Type:FrequentUnregisterNetDevice,Status:False,LastHeartbeatTime:2022-11-26 18:44:38 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentUnregisterNetDevice,Message:node is functioning properly,},NodeCondition{Type:FrequentKubeletRestart,Status:False,LastHeartbeatTime:2022-11-26 18:44:38 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentKubeletRestart,Message:kubelet is functioning properly,},NodeCondition{Type:FrequentDockerRestart,Status:False,LastHeartbeatTime:2022-11-26 18:44:38 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentDockerRestart,Message:docker is functioning properly,},NodeCondition{Type:NetworkUnavailable,Status:False,LastHeartbeatTime:2022-11-26 18:39:46 +0000 UTC,LastTransitionTime:2022-11-26 18:39:46 +0000 UTC,Reason:RouteCreated,Message:RouteController created a route,},NodeCondition{Type:MemoryPressure,Status:False,LastHeartbeatTime:2022-11-26 18:46:41 +0000 UTC,LastTransitionTime:2022-11-26 18:39:33 +0000 UTC,Reason:KubeletHasSufficientMemory,Message:kubelet has sufficient memory available,},NodeCondition{Type:DiskPressure,Status:False,LastHeartbeatTime:2022-11-26 18:46:41 +0000 UTC,LastTransitionTime:2022-11-26 18:39:33 +0000 UTC,Reason:KubeletHasNoDiskPressure,Message:kubelet has no disk pressure,},NodeCondition{Type:PIDPressure,Status:False,LastHeartbeatTime:2022-11-26 18:46:41 +0000 UTC,LastTransitionTime:2022-11-26 18:39:33 +0000 UTC,Reason:KubeletHasSufficientPID,Message:kubelet has sufficient PID available,},NodeCondition{Type:Ready,Status:True,LastHeartbeatTime:2022-11-26 18:46:41 +0000 UTC,LastTransitionTime:2022-11-26 18:39:33 +0000 UTC,Reason:KubeletReady,Message:kubelet is posting ready status. 
AppArmor enabled,},},Addresses:[]NodeAddress{NodeAddress{Type:InternalIP,Address:10.138.0.5,},NodeAddress{Type:ExternalIP,Address:34.168.240.199,},NodeAddress{Type:InternalDNS,Address:bootstrap-e2e-minion-group-dzls.c.k8s-jkns-e2e-gke-ubuntu-slow.internal,},NodeAddress{Type:Hostname,Address:bootstrap-e2e-minion-group-dzls.c.k8s-jkns-e2e-gke-ubuntu-slow.internal,},},DaemonEndpoints:NodeDaemonEndpoints{KubeletEndpoint:DaemonEndpoint{Port:10250,},},NodeInfo:NodeSystemInfo{MachineID:8acf528984519ae32bb25015f71e5ef3,SystemUUID:8acf5289-8451-9ae3-2bb2-5015f71e5ef3,BootID:5e47f3ec-cd63-4220-9ac4-8feb7752a952,KernelVersion:5.10.123+,OSImage:Container-Optimized OS from Google,ContainerRuntimeVersion:containerd://1.7.0-beta.0-149-gd06318622,KubeletVersion:v1.27.0-alpha.0.50+70617042976dc1,KubeProxyVersion:v1.27.0-alpha.0.50+70617042976dc1,OperatingSystem:linux,Architecture:amd64,},Images:[]ContainerImage{ContainerImage{Names:[registry.k8s.io/e2e-test-images/jessie-dnsutils@sha256:24aaf2626d6b27864c29de2097e8bbb840b3a414271bf7c8995e431e47d8408e registry.k8s.io/e2e-test-images/jessie-dnsutils:1.7],SizeBytes:112030336,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/volume/nfs@sha256:3bda73f2428522b0e342af80a0b9679e8594c2126f2b3cca39ed787589741b9e registry.k8s.io/e2e-test-images/volume/nfs:1.3],SizeBytes:95836203,},ContainerImage{Names:[registry.k8s.io/sig-storage/nfs-provisioner@sha256:e943bb77c7df05ebdc8c7888b2db289b13bf9f012d6a3a5a74f14d4d5743d439 registry.k8s.io/sig-storage/nfs-provisioner:v3.0.1],SizeBytes:90632047,},ContainerImage{Names:[registry.k8s.io/kube-proxy-amd64:v1.27.0-alpha.0.50_70617042976dc1],SizeBytes:67201736,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/agnhost@sha256:16bbf38c463a4223d8cfe4da12bc61010b082a79b4bb003e2d3ba3ece5dd5f9e registry.k8s.io/e2e-test-images/agnhost:2.43],SizeBytes:51706353,},ContainerImage{Names:[gke.gcr.io/prometheus-to-sd@sha256:e739643c3939ba0b161425f45a1989eedfc4a3b166db9a7100863296b4c70510 gke.gcr.io/prometheus-to-sd:v0.11.1-gke.1],SizeBytes:48742566,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/httpd@sha256:148b022f5c5da426fc2f3c14b5c0867e58ef05961510c84749ac1fddcb0fef22 registry.k8s.io/e2e-test-images/httpd:2.4.38-4],SizeBytes:40764257,},ContainerImage{Names:[registry.k8s.io/metrics-server/metrics-server@sha256:6385aec64bb97040a5e692947107b81e178555c7a5b71caa90d733e4130efc10 registry.k8s.io/metrics-server/metrics-server:v0.5.2],SizeBytes:26023008,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-provisioner@sha256:ee3b525d5b89db99da3b8eb521d9cd90cb6e9ef0fbb651e98bb37be78d36b5b8 registry.k8s.io/sig-storage/csi-provisioner:v3.3.0],SizeBytes:25491225,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-attacher@sha256:9a685020911e2725ad019dbce6e4a5ab93d51e3d4557f115e64343345e05781b registry.k8s.io/sig-storage/csi-attacher:v4.0.0],SizeBytes:23847201,},ContainerImage{Names:[registry.k8s.io/sig-storage/snapshot-controller@sha256:823c75d0c45d1427f6d850070956d9ca657140a7bbf828381541d1d808475280 registry.k8s.io/sig-storage/snapshot-controller:v6.1.0],SizeBytes:22620891,},ContainerImage{Names:[registry.k8s.io/sig-storage/hostpathplugin@sha256:92257881c1d6493cf18299a24af42330f891166560047902b8d431fb66b01af5 registry.k8s.io/sig-storage/hostpathplugin:v1.9.0],SizeBytes:18758628,},ContainerImage{Names:[registry.k8s.io/cpa/cluster-proportional-autoscaler@sha256:fd636b33485c7826fb20ef0688a83ee0910317dbb6c0c6f3ad14661c1db25def 
registry.k8s.io/cpa/cluster-proportional-autoscaler:1.8.4],SizeBytes:15209393,},ContainerImage{Names:[registry.k8s.io/coredns/coredns@sha256:8e352a029d304ca7431c6507b56800636c321cb52289686a581ab70aaa8a2e2a registry.k8s.io/coredns/coredns:v1.9.3],SizeBytes:14837849,},ContainerImage{Names:[registry.k8s.io/autoscaling/addon-resizer@sha256:43f129b81d28f0fdd54de6d8e7eacd5728030782e03db16087fc241ad747d3d6 registry.k8s.io/autoscaling/addon-resizer:1.8.14],SizeBytes:10153852,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-node-driver-registrar@sha256:0103eee7c35e3e0b5cd8cdca9850dc71c793cdeb6669d8be7a89440da2d06ae4 registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.5.1],SizeBytes:9133109,},ContainerImage{Names:[registry.k8s.io/kas-network-proxy/proxy-agent@sha256:48f2a4ec3e10553a81b8dd1c6fa5fe4bcc9617f78e71c1ca89c6921335e2d7da registry.k8s.io/kas-network-proxy/proxy-agent:v0.0.33],SizeBytes:8512162,},ContainerImage{Names:[registry.k8s.io/networking/ingress-gce-404-server-with-metrics-amd64@sha256:7eb7b3cee4d33c10c49893ad3c386232b86d4067de5251294d4c620d6e072b93 registry.k8s.io/networking/ingress-gce-404-server-with-metrics-amd64:v1.10.11],SizeBytes:6463068,},ContainerImage{Names:[registry.k8s.io/metadata-proxy@sha256:e914645f22e946bce5165737e1b244e0a296ad1f0f81a9531adc57af2780978a registry.k8s.io/metadata-proxy:v0.1.12],SizeBytes:5301657,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/busybox@sha256:2e0f836850e09b8b7cc937681d6194537a09fbd5f6b9e08f4d646a85128e8937 registry.k8s.io/e2e-test-images/busybox:1.29-4],SizeBytes:731990,},ContainerImage{Names:[registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097 registry.k8s.io/pause:3.9],SizeBytes:321520,},ContainerImage{Names:[registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d registry.k8s.io/pause:3.8],SizeBytes:311286,},},VolumesInUse:[],VolumesAttached:[]AttachedVolume{},Config:nil,},} Nov 26 18:49:04.882: INFO: Logging kubelet events for node bootstrap-e2e-minion-group-dzls Nov 26 18:49:04.934: INFO: Logging pods the kubelet thinks is on node bootstrap-e2e-minion-group-dzls Nov 26 18:49:05.078: INFO: l7-default-backend-8549d69d99-wtn4d started at 2022-11-26 18:39:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.078: INFO: Container default-http-backend ready: true, restart count 0 Nov 26 18:49:05.078: INFO: affinity-lb-esipp-transition-brzbx started at 2022-11-26 18:45:17 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.078: INFO: Container affinity-lb-esipp-transition ready: true, restart count 1 Nov 26 18:49:05.078: INFO: pvc-volume-tester-nj8zb started at 2022-11-26 18:44:17 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.078: INFO: Container volume-tester ready: false, restart count 0 Nov 26 18:49:05.078: INFO: pvc-volume-tester-9phjk started at 2022-11-26 18:48:38 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.078: INFO: Container volume-tester ready: false, restart count 0 Nov 26 18:49:05.078: INFO: execpod-acceptwm2nm started at 2022-11-26 18:41:50 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.078: INFO: Container agnhost-container ready: true, restart count 1 Nov 26 18:49:05.078: INFO: lb-sourcerange-b4r2c started at 2022-11-26 18:42:06 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.078: INFO: Container netexec ready: true, restart count 3 Nov 26 18:49:05.078: INFO: mutability-test-26zlk started at 2022-11-26 18:47:41 +0000 UTC (0+1 container statuses recorded) Nov 
26 18:49:05.078: INFO: Container netexec ready: true, restart count 2 Nov 26 18:49:05.078: INFO: csi-mockplugin-0 started at 2022-11-26 18:48:36 +0000 UTC (0+3 container statuses recorded) Nov 26 18:49:05.078: INFO: Container csi-provisioner ready: true, restart count 1 Nov 26 18:49:05.078: INFO: Container driver-registrar ready: true, restart count 1 Nov 26 18:49:05.078: INFO: Container mock ready: true, restart count 1 Nov 26 18:49:05.078: INFO: metadata-proxy-v0.1-zcpfg started at 2022-11-26 18:39:33 +0000 UTC (0+2 container statuses recorded) Nov 26 18:49:05.078: INFO: Container metadata-proxy ready: true, restart count 0 Nov 26 18:49:05.078: INFO: Container prometheus-to-sd-exporter ready: true, restart count 0 Nov 26 18:49:05.078: INFO: konnectivity-agent-jfgxg started at 2022-11-26 18:39:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.078: INFO: Container konnectivity-agent ready: false, restart count 5 Nov 26 18:49:05.078: INFO: netserver-0 started at 2022-11-26 18:46:43 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.078: INFO: Container webserver ready: true, restart count 3 Nov 26 18:49:05.078: INFO: coredns-6d97d5ddb-l6khp started at 2022-11-26 18:39:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.078: INFO: Container coredns ready: false, restart count 5 Nov 26 18:49:05.078: INFO: volume-snapshot-controller-0 started at 2022-11-26 18:39:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.078: INFO: Container volume-snapshot-controller ready: true, restart count 4 Nov 26 18:49:05.078: INFO: external-local-nodes-z9qgt started at 2022-11-26 18:45:27 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.078: INFO: Container netexec ready: false, restart count 5 Nov 26 18:49:05.078: INFO: nfs-server started at 2022-11-26 18:46:10 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.078: INFO: Container nfs-server ready: true, restart count 2 Nov 26 18:49:05.078: INFO: pod-back-off-image started at 2022-11-26 18:48:36 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.078: INFO: Container back-off ready: false, restart count 1 Nov 26 18:49:05.078: INFO: kube-dns-autoscaler-5f6455f985-rcp87 started at 2022-11-26 18:39:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.078: INFO: Container autoscaler ready: false, restart count 4 Nov 26 18:49:05.078: INFO: net-tiers-svc-lp9r4 started at 2022-11-26 18:41:49 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.078: INFO: Container netexec ready: false, restart count 3 Nov 26 18:49:05.078: INFO: csi-mockplugin-attacher-0 started at 2022-11-26 18:44:06 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.078: INFO: Container csi-attacher ready: true, restart count 3 Nov 26 18:49:05.078: INFO: netserver-0 started at 2022-11-26 18:44:09 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.078: INFO: Container webserver ready: true, restart count 4 Nov 26 18:49:05.078: INFO: pvc-volume-tester-ptk74 started at 2022-11-26 18:44:26 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.078: INFO: Container volume-tester ready: false, restart count 0 Nov 26 18:49:05.078: INFO: external-local-update-nfqlg started at 2022-11-26 18:46:40 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.078: INFO: Container netexec ready: true, restart count 0 Nov 26 18:49:05.078: INFO: test-container-pod started at 2022-11-26 18:45:22 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.078: INFO: Container webserver ready: true, 
restart count 0 Nov 26 18:49:05.078: INFO: test-container-pod started at 2022-11-26 18:48:19 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.078: INFO: Container webserver ready: true, restart count 2 Nov 26 18:49:05.078: INFO: kube-proxy-bootstrap-e2e-minion-group-dzls started at 2022-11-26 18:39:33 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.078: INFO: Container kube-proxy ready: false, restart count 5 Nov 26 18:49:05.078: INFO: csi-mockplugin-0 started at 2022-11-26 18:44:05 +0000 UTC (0+3 container statuses recorded) Nov 26 18:49:05.078: INFO: Container csi-provisioner ready: true, restart count 0 Nov 26 18:49:05.078: INFO: Container driver-registrar ready: true, restart count 0 Nov 26 18:49:05.078: INFO: Container mock ready: true, restart count 0 Nov 26 18:49:05.078: INFO: lb-internal-fmnnv started at 2022-11-26 18:47:14 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.078: INFO: Container netexec ready: false, restart count 1 Nov 26 18:49:05.380: INFO: Latency metrics for node bootstrap-e2e-minion-group-dzls Nov 26 18:49:05.380: INFO: Logging node info for node bootstrap-e2e-minion-group-gnb8 Nov 26 18:49:05.426: INFO: Node Info: &Node{ObjectMeta:{bootstrap-e2e-minion-group-gnb8 d034f9b4-7cc6-4262-a4af-02040c7b7abd 6450 0 2022-11-26 18:39:33 +0000 UTC <nil> <nil> map[beta.kubernetes.io/arch:amd64 beta.kubernetes.io/instance-type:n1-standard-2 beta.kubernetes.io/os:linux cloud.google.com/metadata-proxy-ready:true failure-domain.beta.kubernetes.io/region:us-west1 failure-domain.beta.kubernetes.io/zone:us-west1-b kubernetes.io/arch:amd64 kubernetes.io/hostname:bootstrap-e2e-minion-group-gnb8 kubernetes.io/os:linux node.kubernetes.io/instance-type:n1-standard-2 topology.hostpath.csi/node:bootstrap-e2e-minion-group-gnb8 topology.kubernetes.io/region:us-west1 topology.kubernetes.io/zone:us-west1-b] map[csi.volume.kubernetes.io/nodeid:{"csi-hostpath-multivolume-3170":"bootstrap-e2e-minion-group-gnb8","csi-hostpath-provisioning-1566":"bootstrap-e2e-minion-group-gnb8","csi-mock-csi-mock-volumes-9058":"csi-mock-csi-mock-volumes-9058"} node.alpha.kubernetes.io/ttl:0 volumes.kubernetes.io/controller-managed-attach-detach:true] [] [] [{kube-controller-manager Update v1 2022-11-26 18:39:33 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}},"f:spec":{"f:podCIDR":{},"f:podCIDRs":{".":{},"v:\"10.64.2.0/24\"":{}}}} } {kubelet Update v1 2022-11-26 18:39:33 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-attach-detach":{}},"f:labels":{".":{},"f:beta.kubernetes.io/arch":{},"f:beta.kubernetes.io/instance-type":{},"f:beta.kubernetes.io/os":{},"f:cloud.google.com/metadata-proxy-ready":{},"f:failure-domain.beta.kubernetes.io/region":{},"f:failure-domain.beta.kubernetes.io/zone":{},"f:kubernetes.io/arch":{},"f:kubernetes.io/hostname":{},"f:kubernetes.io/os":{},"f:node.kubernetes.io/instance-type":{},"f:topology.kubernetes.io/region":{},"f:topology.kubernetes.io/zone":{}}},"f:spec":{"f:providerID":{}}} } {node-problem-detector Update v1 2022-11-26 18:44:37 +0000 UTC FieldsV1 
{"f:status":{"f:conditions":{"k:{\"type\":\"CorruptDockerOverlay2\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentContainerdRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentDockerRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentKubeletRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentUnregisterNetDevice\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"KernelDeadlock\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"ReadonlyFilesystem\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}}}}} status} {kube-controller-manager Update v1 2022-11-26 18:45:11 +0000 UTC FieldsV1 {"f:status":{"f:conditions":{"k:{\"type\":\"NetworkUnavailable\"}":{"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{}}}}} status} {kubelet Update v1 2022-11-26 18:48:46 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:csi.volume.kubernetes.io/nodeid":{}},"f:labels":{"f:topology.hostpath.csi/node":{}}},"f:status":{"f:conditions":{"k:{\"type\":\"DiskPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"MemoryPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"PIDPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"Ready\"}":{"f:lastHeartbeatTime":{},"f:message":{},"f:reason":{},"f:status":{}}},"f:images":{}}} status}]},Spec:NodeSpec{PodCIDR:10.64.2.0/24,DoNotUseExternalID:,ProviderID:gce://k8s-jkns-e2e-gke-ubuntu-slow/us-west1-b/bootstrap-e2e-minion-group-gnb8,Unschedulable:false,Taints:[]Taint{},ConfigSource:nil,PodCIDRs:[10.64.2.0/24],},Status:NodeStatus{Capacity:ResourceList{attachable-volumes-gce-pd: {{127 0} {<nil>} 127 DecimalSI},cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{101203873792 0} {<nil>} 98831908Ki BinarySI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{7815438336 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Allocatable:ResourceList{attachable-volumes-gce-pd: {{127 0} {<nil>} 127 DecimalSI},cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{91083486262 0} {<nil>} 91083486262 DecimalSI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{7553294336 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Phase:,Conditions:[]NodeCondition{NodeCondition{Type:FrequentKubeletRestart,Status:False,LastHeartbeatTime:2022-11-26 18:44:37 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentKubeletRestart,Message:kubelet is functioning properly,},NodeCondition{Type:FrequentDockerRestart,Status:False,LastHeartbeatTime:2022-11-26 18:44:37 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentDockerRestart,Message:docker is functioning properly,},NodeCondition{Type:FrequentContainerdRestart,Status:False,LastHeartbeatTime:2022-11-26 18:44:37 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentContainerdRestart,Message:containerd is functioning 
properly,},NodeCondition{Type:CorruptDockerOverlay2,Status:False,LastHeartbeatTime:2022-11-26 18:44:37 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoCorruptDockerOverlay2,Message:docker overlay2 is functioning properly,},NodeCondition{Type:KernelDeadlock,Status:False,LastHeartbeatTime:2022-11-26 18:44:37 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:KernelHasNoDeadlock,Message:kernel has no deadlock,},NodeCondition{Type:ReadonlyFilesystem,Status:False,LastHeartbeatTime:2022-11-26 18:44:37 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:FilesystemIsNotReadOnly,Message:Filesystem is not read-only,},NodeCondition{Type:FrequentUnregisterNetDevice,Status:False,LastHeartbeatTime:2022-11-26 18:44:37 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentUnregisterNetDevice,Message:node is functioning properly,},NodeCondition{Type:NetworkUnavailable,Status:False,LastHeartbeatTime:2022-11-26 18:39:46 +0000 UTC,LastTransitionTime:2022-11-26 18:39:46 +0000 UTC,Reason:RouteCreated,Message:RouteController created a route,},NodeCondition{Type:MemoryPressure,Status:False,LastHeartbeatTime:2022-11-26 18:47:04 +0000 UTC,LastTransitionTime:2022-11-26 18:39:33 +0000 UTC,Reason:KubeletHasSufficientMemory,Message:kubelet has sufficient memory available,},NodeCondition{Type:DiskPressure,Status:False,LastHeartbeatTime:2022-11-26 18:47:04 +0000 UTC,LastTransitionTime:2022-11-26 18:39:33 +0000 UTC,Reason:KubeletHasNoDiskPressure,Message:kubelet has no disk pressure,},NodeCondition{Type:PIDPressure,Status:False,LastHeartbeatTime:2022-11-26 18:47:04 +0000 UTC,LastTransitionTime:2022-11-26 18:39:33 +0000 UTC,Reason:KubeletHasSufficientPID,Message:kubelet has sufficient PID available,},NodeCondition{Type:Ready,Status:True,LastHeartbeatTime:2022-11-26 18:47:04 +0000 UTC,LastTransitionTime:2022-11-26 18:39:33 +0000 UTC,Reason:KubeletReady,Message:kubelet is posting ready status. 
AppArmor enabled,},},Addresses:[]NodeAddress{NodeAddress{Type:InternalIP,Address:10.138.0.4,},NodeAddress{Type:ExternalIP,Address:34.83.41.63,},NodeAddress{Type:InternalDNS,Address:bootstrap-e2e-minion-group-gnb8.c.k8s-jkns-e2e-gke-ubuntu-slow.internal,},NodeAddress{Type:Hostname,Address:bootstrap-e2e-minion-group-gnb8.c.k8s-jkns-e2e-gke-ubuntu-slow.internal,},},DaemonEndpoints:NodeDaemonEndpoints{KubeletEndpoint:DaemonEndpoint{Port:10250,},},NodeInfo:NodeSystemInfo{MachineID:975025d51ee8dddd080415bf76b18e9f,SystemUUID:975025d5-1ee8-dddd-0804-15bf76b18e9f,BootID:78ba7c41-cb6b-4d0e-9530-01248f3478cf,KernelVersion:5.10.123+,OSImage:Container-Optimized OS from Google,ContainerRuntimeVersion:containerd://1.7.0-beta.0-149-gd06318622,KubeletVersion:v1.27.0-alpha.0.50+70617042976dc1,KubeProxyVersion:v1.27.0-alpha.0.50+70617042976dc1,OperatingSystem:linux,Architecture:amd64,},Images:[]ContainerImage{ContainerImage{Names:[registry.k8s.io/e2e-test-images/jessie-dnsutils@sha256:24aaf2626d6b27864c29de2097e8bbb840b3a414271bf7c8995e431e47d8408e registry.k8s.io/e2e-test-images/jessie-dnsutils:1.7],SizeBytes:112030336,},ContainerImage{Names:[registry.k8s.io/sig-storage/nfs-provisioner@sha256:e943bb77c7df05ebdc8c7888b2db289b13bf9f012d6a3a5a74f14d4d5743d439 registry.k8s.io/sig-storage/nfs-provisioner:v3.0.1],SizeBytes:90632047,},ContainerImage{Names:[registry.k8s.io/kube-proxy-amd64:v1.27.0-alpha.0.50_70617042976dc1],SizeBytes:67201736,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/agnhost@sha256:16bbf38c463a4223d8cfe4da12bc61010b082a79b4bb003e2d3ba3ece5dd5f9e registry.k8s.io/e2e-test-images/agnhost:2.43],SizeBytes:51706353,},ContainerImage{Names:[gke.gcr.io/prometheus-to-sd@sha256:e739643c3939ba0b161425f45a1989eedfc4a3b166db9a7100863296b4c70510 gke.gcr.io/prometheus-to-sd:v0.11.1-gke.1],SizeBytes:48742566,},ContainerImage{Names:[registry.k8s.io/metrics-server/metrics-server@sha256:6385aec64bb97040a5e692947107b81e178555c7a5b71caa90d733e4130efc10 registry.k8s.io/metrics-server/metrics-server:v0.5.2],SizeBytes:26023008,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-provisioner@sha256:ee3b525d5b89db99da3b8eb521d9cd90cb6e9ef0fbb651e98bb37be78d36b5b8 registry.k8s.io/sig-storage/csi-provisioner:v3.3.0],SizeBytes:25491225,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-resizer@sha256:425d8f1b769398127767b06ed97ce62578a3179bcb99809ce93a1649e025ffe7 registry.k8s.io/sig-storage/csi-resizer:v1.6.0],SizeBytes:24148884,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-snapshotter@sha256:291334908ddf71a4661fd7f6d9d97274de8a5378a2b6fdfeb2ce73414a34f82f registry.k8s.io/sig-storage/csi-snapshotter:v6.1.0],SizeBytes:23881995,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-attacher@sha256:9a685020911e2725ad019dbce6e4a5ab93d51e3d4557f115e64343345e05781b registry.k8s.io/sig-storage/csi-attacher:v4.0.0],SizeBytes:23847201,},ContainerImage{Names:[registry.k8s.io/sig-storage/hostpathplugin@sha256:92257881c1d6493cf18299a24af42330f891166560047902b8d431fb66b01af5 registry.k8s.io/sig-storage/hostpathplugin:v1.9.0],SizeBytes:18758628,},ContainerImage{Names:[registry.k8s.io/autoscaling/addon-resizer@sha256:43f129b81d28f0fdd54de6d8e7eacd5728030782e03db16087fc241ad747d3d6 registry.k8s.io/autoscaling/addon-resizer:1.8.14],SizeBytes:10153852,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-node-driver-registrar@sha256:0103eee7c35e3e0b5cd8cdca9850dc71c793cdeb6669d8be7a89440da2d06ae4 
registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.5.1],SizeBytes:9133109,},ContainerImage{Names:[registry.k8s.io/sig-storage/livenessprobe@sha256:933940f13b3ea0abc62e656c1aa5c5b47c04b15d71250413a6b821bd0c58b94e registry.k8s.io/sig-storage/livenessprobe:v2.7.0],SizeBytes:8688564,},ContainerImage{Names:[registry.k8s.io/kas-network-proxy/proxy-agent@sha256:48f2a4ec3e10553a81b8dd1c6fa5fe4bcc9617f78e71c1ca89c6921335e2d7da registry.k8s.io/kas-network-proxy/proxy-agent:v0.0.33],SizeBytes:8512162,},ContainerImage{Names:[registry.k8s.io/metadata-proxy@sha256:e914645f22e946bce5165737e1b244e0a296ad1f0f81a9531adc57af2780978a registry.k8s.io/metadata-proxy:v0.1.12],SizeBytes:5301657,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/busybox@sha256:c318242786b139d18676b1c09a0ad7f15fc17f8f16a5b2e625cd0dc8c9703daf registry.k8s.io/e2e-test-images/busybox:1.29-2],SizeBytes:732424,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/busybox@sha256:2e0f836850e09b8b7cc937681d6194537a09fbd5f6b9e08f4d646a85128e8937 registry.k8s.io/e2e-test-images/busybox:1.29-4],SizeBytes:731990,},ContainerImage{Names:[registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097 registry.k8s.io/pause:3.9],SizeBytes:321520,},ContainerImage{Names:[registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d registry.k8s.io/pause:3.8],SizeBytes:311286,},},VolumesInUse:[],VolumesAttached:[]AttachedVolume{},Config:nil,},} Nov 26 18:49:05.427: INFO: Logging kubelet events for node bootstrap-e2e-minion-group-gnb8 Nov 26 18:49:05.479: INFO: Logging pods the kubelet thinks is on node bootstrap-e2e-minion-group-gnb8 Nov 26 18:49:05.582: INFO: csi-hostpathplugin-0 started at 2022-11-26 18:44:49 +0000 UTC (0+7 container statuses recorded) Nov 26 18:49:05.582: INFO: Container csi-attacher ready: false, restart count 2 Nov 26 18:49:05.582: INFO: Container csi-provisioner ready: false, restart count 2 Nov 26 18:49:05.582: INFO: Container csi-resizer ready: false, restart count 2 Nov 26 18:49:05.582: INFO: Container csi-snapshotter ready: false, restart count 2 Nov 26 18:49:05.582: INFO: Container hostpath ready: false, restart count 2 Nov 26 18:49:05.582: INFO: Container liveness-probe ready: false, restart count 2 Nov 26 18:49:05.582: INFO: Container node-driver-registrar ready: false, restart count 2 Nov 26 18:49:05.582: INFO: hostexec-bootstrap-e2e-minion-group-gnb8-dk56q started at 2022-11-26 18:48:48 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.582: INFO: Container agnhost-container ready: true, restart count 1 Nov 26 18:49:05.582: INFO: execpod-dropfkgh8 started at 2022-11-26 18:41:56 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.582: INFO: Container agnhost-container ready: true, restart count 3 Nov 26 18:49:05.582: INFO: metadata-proxy-v0.1-c77zk started at 2022-11-26 18:39:33 +0000 UTC (0+2 container statuses recorded) Nov 26 18:49:05.582: INFO: Container metadata-proxy ready: true, restart count 0 Nov 26 18:49:05.582: INFO: Container prometheus-to-sd-exporter ready: true, restart count 0 Nov 26 18:49:05.582: INFO: csi-hostpathplugin-0 started at 2022-11-26 18:44:47 +0000 UTC (0+7 container statuses recorded) Nov 26 18:49:05.582: INFO: Container csi-attacher ready: true, restart count 0 Nov 26 18:49:05.582: INFO: Container csi-provisioner ready: true, restart count 0 Nov 26 18:49:05.582: INFO: Container csi-resizer ready: true, restart count 0 Nov 26 18:49:05.582: INFO: Container csi-snapshotter ready: true, restart count 0 
Nov 26 18:49:05.582: INFO: Container hostpath ready: true, restart count 0 Nov 26 18:49:05.582: INFO: Container liveness-probe ready: true, restart count 0 Nov 26 18:49:05.582: INFO: Container node-driver-registrar ready: true, restart count 0 Nov 26 18:49:05.582: INFO: csi-hostpathplugin-0 started at 2022-11-26 18:43:33 +0000 UTC (0+7 container statuses recorded) Nov 26 18:49:05.582: INFO: Container csi-attacher ready: true, restart count 4 Nov 26 18:49:05.582: INFO: Container csi-provisioner ready: true, restart count 4 Nov 26 18:49:05.582: INFO: Container csi-resizer ready: true, restart count 4 Nov 26 18:49:05.582: INFO: Container csi-snapshotter ready: true, restart count 4 Nov 26 18:49:05.582: INFO: Container hostpath ready: true, restart count 4 Nov 26 18:49:05.582: INFO: Container liveness-probe ready: true, restart count 4 Nov 26 18:49:05.582: INFO: Container node-driver-registrar ready: true, restart count 4 Nov 26 18:49:05.582: INFO: kube-proxy-bootstrap-e2e-minion-group-gnb8 started at 2022-11-26 18:39:33 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.582: INFO: Container kube-proxy ready: false, restart count 5 Nov 26 18:49:05.582: INFO: metrics-server-v0.5.2-867b8754b9-5tc55 started at 2022-11-26 18:40:04 +0000 UTC (0+2 container statuses recorded) Nov 26 18:49:05.582: INFO: Container metrics-server ready: false, restart count 6 Nov 26 18:49:05.582: INFO: Container metrics-server-nanny ready: false, restart count 5 Nov 26 18:49:05.582: INFO: netserver-1 started at 2022-11-26 18:44:09 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.582: INFO: Container webserver ready: false, restart count 4 Nov 26 18:49:05.582: INFO: affinity-lb-esipp-transition-qkrbm started at 2022-11-26 18:45:17 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.582: INFO: Container affinity-lb-esipp-transition ready: true, restart count 0 Nov 26 18:49:05.582: INFO: netserver-1 started at 2022-11-26 18:46:43 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.582: INFO: Container webserver ready: true, restart count 1 Nov 26 18:49:05.582: INFO: pod-f6d9d5ce-4853-4468-80b6-8ed566c45975 started at 2022-11-26 18:49:01 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.582: INFO: Container write-pod ready: true, restart count 0 Nov 26 18:49:05.582: INFO: konnectivity-agent-rlpjb started at 2022-11-26 18:39:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:05.582: INFO: Container konnectivity-agent ready: true, restart count 5 Nov 26 18:49:05.582: INFO: csi-mockplugin-0 started at 2022-11-26 18:46:01 +0000 UTC (0+4 container statuses recorded) Nov 26 18:49:05.582: INFO: Container busybox ready: true, restart count 1 Nov 26 18:49:05.582: INFO: Container csi-provisioner ready: true, restart count 2 Nov 26 18:49:05.582: INFO: Container driver-registrar ready: true, restart count 1 Nov 26 18:49:05.582: INFO: Container mock ready: true, restart count 1 Nov 26 18:49:05.903: INFO: Latency metrics for node bootstrap-e2e-minion-group-gnb8 Nov 26 18:49:05.903: INFO: Logging node info for node bootstrap-e2e-minion-group-p1wq Nov 26 18:49:05.947: INFO: Node Info: &Node{ObjectMeta:{bootstrap-e2e-minion-group-p1wq e4a1a2e4-c0df-40e4-9f3f-bc50b8a3f74f 6534 0 2022-11-26 18:39:32 +0000 UTC <nil> <nil> map[beta.kubernetes.io/arch:amd64 beta.kubernetes.io/instance-type:n1-standard-2 beta.kubernetes.io/os:linux cloud.google.com/metadata-proxy-ready:true failure-domain.beta.kubernetes.io/region:us-west1 failure-domain.beta.kubernetes.io/zone:us-west1-b 
kubernetes.io/arch:amd64 kubernetes.io/hostname:bootstrap-e2e-minion-group-p1wq kubernetes.io/os:linux node.kubernetes.io/instance-type:n1-standard-2 topology.hostpath.csi/node:bootstrap-e2e-minion-group-p1wq topology.kubernetes.io/region:us-west1 topology.kubernetes.io/zone:us-west1-b] map[csi.volume.kubernetes.io/nodeid:{"csi-hostpath-provisioning-8805":"bootstrap-e2e-minion-group-p1wq","csi-mock-csi-mock-volumes-2715":"bootstrap-e2e-minion-group-p1wq"} node.alpha.kubernetes.io/ttl:0 volumes.kubernetes.io/controller-managed-attach-detach:true] [] [] [{kubelet Update v1 2022-11-26 18:39:32 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-attach-detach":{}},"f:labels":{".":{},"f:beta.kubernetes.io/arch":{},"f:beta.kubernetes.io/instance-type":{},"f:beta.kubernetes.io/os":{},"f:cloud.google.com/metadata-proxy-ready":{},"f:failure-domain.beta.kubernetes.io/region":{},"f:failure-domain.beta.kubernetes.io/zone":{},"f:kubernetes.io/arch":{},"f:kubernetes.io/hostname":{},"f:kubernetes.io/os":{},"f:node.kubernetes.io/instance-type":{},"f:topology.kubernetes.io/region":{},"f:topology.kubernetes.io/zone":{}}},"f:spec":{"f:providerID":{}}} } {kube-controller-manager Update v1 2022-11-26 18:39:33 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}},"f:spec":{"f:podCIDR":{},"f:podCIDRs":{".":{},"v:\"10.64.1.0/24\"":{}}}} } {node-problem-detector Update v1 2022-11-26 18:44:37 +0000 UTC FieldsV1 {"f:status":{"f:conditions":{"k:{\"type\":\"CorruptDockerOverlay2\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentContainerdRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentDockerRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentKubeletRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentUnregisterNetDevice\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"KernelDeadlock\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"ReadonlyFilesystem\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}}}}} status} {kube-controller-manager Update v1 2022-11-26 18:47:00 +0000 UTC FieldsV1 {"f:status":{"f:conditions":{"k:{\"type\":\"NetworkUnavailable\"}":{"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{}}}}} status} {kubelet Update v1 2022-11-26 18:48:55 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:csi.volume.kubernetes.io/nodeid":{}},"f:labels":{"f:topology.hostpath.csi/node":{}}},"f:status":{"f:conditions":{"k:{\"type\":\"DiskPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"MemoryPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"PIDPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"Ready\"}":{"f:lastHeartbeatTime":{},"f:message":{},"f:reason":{},"f:status":{}}},"f:images":{}}} 
status}]},Spec:NodeSpec{PodCIDR:10.64.1.0/24,DoNotUseExternalID:,ProviderID:gce://k8s-jkns-e2e-gke-ubuntu-slow/us-west1-b/bootstrap-e2e-minion-group-p1wq,Unschedulable:false,Taints:[]Taint{},ConfigSource:nil,PodCIDRs:[10.64.1.0/24],},Status:NodeStatus{Capacity:ResourceList{attachable-volumes-gce-pd: {{127 0} {<nil>} 127 DecimalSI},cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{101203873792 0} {<nil>} 98831908Ki BinarySI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{7815438336 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Allocatable:ResourceList{attachable-volumes-gce-pd: {{127 0} {<nil>} 127 DecimalSI},cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{91083486262 0} {<nil>} 91083486262 DecimalSI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{7553294336 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Phase:,Conditions:[]NodeCondition{NodeCondition{Type:ReadonlyFilesystem,Status:False,LastHeartbeatTime:2022-11-26 18:44:37 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:FilesystemIsNotReadOnly,Message:Filesystem is not read-only,},NodeCondition{Type:FrequentUnregisterNetDevice,Status:False,LastHeartbeatTime:2022-11-26 18:44:37 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentUnregisterNetDevice,Message:node is functioning properly,},NodeCondition{Type:FrequentKubeletRestart,Status:False,LastHeartbeatTime:2022-11-26 18:44:37 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentKubeletRestart,Message:kubelet is functioning properly,},NodeCondition{Type:FrequentDockerRestart,Status:False,LastHeartbeatTime:2022-11-26 18:44:37 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentDockerRestart,Message:docker is functioning properly,},NodeCondition{Type:FrequentContainerdRestart,Status:False,LastHeartbeatTime:2022-11-26 18:44:37 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentContainerdRestart,Message:containerd is functioning properly,},NodeCondition{Type:CorruptDockerOverlay2,Status:False,LastHeartbeatTime:2022-11-26 18:44:37 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoCorruptDockerOverlay2,Message:docker overlay2 is functioning properly,},NodeCondition{Type:KernelDeadlock,Status:False,LastHeartbeatTime:2022-11-26 18:44:37 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:KernelHasNoDeadlock,Message:kernel has no deadlock,},NodeCondition{Type:NetworkUnavailable,Status:False,LastHeartbeatTime:2022-11-26 18:39:46 +0000 UTC,LastTransitionTime:2022-11-26 18:39:46 +0000 UTC,Reason:RouteCreated,Message:RouteController created a route,},NodeCondition{Type:MemoryPressure,Status:False,LastHeartbeatTime:2022-11-26 18:48:55 +0000 UTC,LastTransitionTime:2022-11-26 18:39:32 +0000 UTC,Reason:KubeletHasSufficientMemory,Message:kubelet has sufficient memory available,},NodeCondition{Type:DiskPressure,Status:False,LastHeartbeatTime:2022-11-26 18:48:55 +0000 UTC,LastTransitionTime:2022-11-26 18:39:32 +0000 UTC,Reason:KubeletHasNoDiskPressure,Message:kubelet has no disk pressure,},NodeCondition{Type:PIDPressure,Status:False,LastHeartbeatTime:2022-11-26 18:48:55 +0000 UTC,LastTransitionTime:2022-11-26 18:39:32 +0000 UTC,Reason:KubeletHasSufficientPID,Message:kubelet has sufficient PID available,},NodeCondition{Type:Ready,Status:True,LastHeartbeatTime:2022-11-26 18:48:55 +0000 UTC,LastTransitionTime:2022-11-26 18:39:32 
+0000 UTC,Reason:KubeletReady,Message:kubelet is posting ready status. AppArmor enabled,},},Addresses:[]NodeAddress{NodeAddress{Type:InternalIP,Address:10.138.0.3,},NodeAddress{Type:ExternalIP,Address:34.168.33.242,},NodeAddress{Type:InternalDNS,Address:bootstrap-e2e-minion-group-p1wq.c.k8s-jkns-e2e-gke-ubuntu-slow.internal,},NodeAddress{Type:Hostname,Address:bootstrap-e2e-minion-group-p1wq.c.k8s-jkns-e2e-gke-ubuntu-slow.internal,},},DaemonEndpoints:NodeDaemonEndpoints{KubeletEndpoint:DaemonEndpoint{Port:10250,},},NodeInfo:NodeSystemInfo{MachineID:af52744010a8d3f305ac13b550d67291,SystemUUID:af527440-10a8-d3f3-05ac-13b550d67291,BootID:aec04ea1-f168-4b94-9143-5f1e29a8fc63,KernelVersion:5.10.123+,OSImage:Container-Optimized OS from Google,ContainerRuntimeVersion:containerd://1.7.0-beta.0-149-gd06318622,KubeletVersion:v1.27.0-alpha.0.50+70617042976dc1,KubeProxyVersion:v1.27.0-alpha.0.50+70617042976dc1,OperatingSystem:linux,Architecture:amd64,},Images:[]ContainerImage{ContainerImage{Names:[registry.k8s.io/kube-proxy-amd64:v1.27.0-alpha.0.50_70617042976dc1],SizeBytes:67201736,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/agnhost@sha256:16bbf38c463a4223d8cfe4da12bc61010b082a79b4bb003e2d3ba3ece5dd5f9e registry.k8s.io/e2e-test-images/agnhost:2.43],SizeBytes:51706353,},ContainerImage{Names:[gke.gcr.io/prometheus-to-sd@sha256:e739643c3939ba0b161425f45a1989eedfc4a3b166db9a7100863296b4c70510 gke.gcr.io/prometheus-to-sd:v0.11.1-gke.1],SizeBytes:48742566,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-provisioner@sha256:ee3b525d5b89db99da3b8eb521d9cd90cb6e9ef0fbb651e98bb37be78d36b5b8 registry.k8s.io/sig-storage/csi-provisioner:v3.3.0],SizeBytes:25491225,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-resizer@sha256:425d8f1b769398127767b06ed97ce62578a3179bcb99809ce93a1649e025ffe7 registry.k8s.io/sig-storage/csi-resizer:v1.6.0],SizeBytes:24148884,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-snapshotter@sha256:291334908ddf71a4661fd7f6d9d97274de8a5378a2b6fdfeb2ce73414a34f82f registry.k8s.io/sig-storage/csi-snapshotter:v6.1.0],SizeBytes:23881995,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-attacher@sha256:9a685020911e2725ad019dbce6e4a5ab93d51e3d4557f115e64343345e05781b registry.k8s.io/sig-storage/csi-attacher:v4.0.0],SizeBytes:23847201,},ContainerImage{Names:[registry.k8s.io/sig-storage/hostpathplugin@sha256:92257881c1d6493cf18299a24af42330f891166560047902b8d431fb66b01af5 registry.k8s.io/sig-storage/hostpathplugin:v1.9.0],SizeBytes:18758628,},ContainerImage{Names:[registry.k8s.io/coredns/coredns@sha256:8e352a029d304ca7431c6507b56800636c321cb52289686a581ab70aaa8a2e2a registry.k8s.io/coredns/coredns:v1.9.3],SizeBytes:14837849,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-node-driver-registrar@sha256:0103eee7c35e3e0b5cd8cdca9850dc71c793cdeb6669d8be7a89440da2d06ae4 registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.5.1],SizeBytes:9133109,},ContainerImage{Names:[registry.k8s.io/sig-storage/livenessprobe@sha256:933940f13b3ea0abc62e656c1aa5c5b47c04b15d71250413a6b821bd0c58b94e registry.k8s.io/sig-storage/livenessprobe:v2.7.0],SizeBytes:8688564,},ContainerImage{Names:[registry.k8s.io/kas-network-proxy/proxy-agent@sha256:48f2a4ec3e10553a81b8dd1c6fa5fe4bcc9617f78e71c1ca89c6921335e2d7da registry.k8s.io/kas-network-proxy/proxy-agent:v0.0.33],SizeBytes:8512162,},ContainerImage{Names:[registry.k8s.io/metadata-proxy@sha256:e914645f22e946bce5165737e1b244e0a296ad1f0f81a9531adc57af2780978a 
registry.k8s.io/metadata-proxy:v0.1.12],SizeBytes:5301657,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/busybox@sha256:c318242786b139d18676b1c09a0ad7f15fc17f8f16a5b2e625cd0dc8c9703daf registry.k8s.io/e2e-test-images/busybox:1.29-2],SizeBytes:732424,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/busybox@sha256:2e0f836850e09b8b7cc937681d6194537a09fbd5f6b9e08f4d646a85128e8937 registry.k8s.io/e2e-test-images/busybox:1.29-4],SizeBytes:731990,},ContainerImage{Names:[registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097 registry.k8s.io/pause:3.9],SizeBytes:321520,},ContainerImage{Names:[registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d registry.k8s.io/pause:3.8],SizeBytes:311286,},},VolumesInUse:[],VolumesAttached:[]AttachedVolume{},Config:nil,},} Nov 26 18:49:05.947: INFO: Logging kubelet events for node bootstrap-e2e-minion-group-p1wq Nov 26 18:49:05.996: INFO: Logging pods the kubelet thinks is on node bootstrap-e2e-minion-group-p1wq Nov 26 18:49:06.085: INFO: csi-mockplugin-0 started at 2022-11-26 18:41:51 +0000 UTC (0+3 container statuses recorded) Nov 26 18:49:06.085: INFO: Container csi-provisioner ready: true, restart count 3 Nov 26 18:49:06.085: INFO: Container driver-registrar ready: true, restart count 3 Nov 26 18:49:06.085: INFO: Container mock ready: true, restart count 3 Nov 26 18:49:06.085: INFO: kube-proxy-bootstrap-e2e-minion-group-p1wq started at 2022-11-26 18:39:32 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:06.085: INFO: Container kube-proxy ready: true, restart count 5 Nov 26 18:49:06.085: INFO: konnectivity-agent-dk4z2 started at 2022-11-26 18:39:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:06.086: INFO: Container konnectivity-agent ready: true, restart count 4 Nov 26 18:49:06.086: INFO: coredns-6d97d5ddb-xg7m7 started at 2022-11-26 18:39:51 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:06.086: INFO: Container coredns ready: false, restart count 5 Nov 26 18:49:06.086: INFO: netserver-2 started at 2022-11-26 18:44:09 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:06.086: INFO: Container webserver ready: true, restart count 5 Nov 26 18:49:06.086: INFO: csi-mockplugin-0 started at 2022-11-26 18:45:36 +0000 UTC (0+4 container statuses recorded) Nov 26 18:49:06.086: INFO: Container busybox ready: false, restart count 2 Nov 26 18:49:06.086: INFO: Container csi-provisioner ready: true, restart count 3 Nov 26 18:49:06.086: INFO: Container driver-registrar ready: true, restart count 3 Nov 26 18:49:06.086: INFO: Container mock ready: true, restart count 3 Nov 26 18:49:06.086: INFO: csi-mockplugin-0 started at 2022-11-26 18:44:44 +0000 UTC (0+4 container statuses recorded) Nov 26 18:49:06.086: INFO: Container busybox ready: true, restart count 3 Nov 26 18:49:06.086: INFO: Container csi-provisioner ready: false, restart count 3 Nov 26 18:49:06.086: INFO: Container driver-registrar ready: false, restart count 4 Nov 26 18:49:06.086: INFO: Container mock ready: false, restart count 4 Nov 26 18:49:06.086: INFO: pod-subpath-test-inlinevolume-gshk started at 2022-11-26 18:49:03 +0000 UTC (1+2 container statuses recorded) Nov 26 18:49:06.086: INFO: Init container init-volume-inlinevolume-gshk ready: true, restart count 0 Nov 26 18:49:06.086: INFO: Container test-container-subpath-inlinevolume-gshk ready: true, restart count 0 Nov 26 18:49:06.086: INFO: Container test-container-volume-inlinevolume-gshk ready: true, restart count 0 Nov 26 
18:49:06.086: INFO: netserver-2 started at 2022-11-26 18:46:43 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:06.086: INFO: Container webserver ready: false, restart count 4 Nov 26 18:49:06.086: INFO: metadata-proxy-v0.1-89kmg started at 2022-11-26 18:39:33 +0000 UTC (0+2 container statuses recorded) Nov 26 18:49:06.086: INFO: Container metadata-proxy ready: true, restart count 0 Nov 26 18:49:06.086: INFO: Container prometheus-to-sd-exporter ready: true, restart count 0 Nov 26 18:49:06.086: INFO: csi-hostpathplugin-0 started at 2022-11-26 18:44:06 +0000 UTC (0+7 container statuses recorded) Nov 26 18:49:06.086: INFO: Container csi-attacher ready: false, restart count 3 Nov 26 18:49:06.086: INFO: Container csi-provisioner ready: false, restart count 3 Nov 26 18:49:06.086: INFO: Container csi-resizer ready: false, restart count 3 Nov 26 18:49:06.086: INFO: Container csi-snapshotter ready: false, restart count 3 Nov 26 18:49:06.086: INFO: Container hostpath ready: false, restart count 3 Nov 26 18:49:06.086: INFO: Container liveness-probe ready: false, restart count 3 Nov 26 18:49:06.086: INFO: Container node-driver-registrar ready: false, restart count 3 Nov 26 18:49:06.086: INFO: csi-mockplugin-attacher-0 started at 2022-11-26 18:41:51 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:06.086: INFO: Container csi-attacher ready: true, restart count 4 Nov 26 18:49:06.086: INFO: affinity-lb-esipp-transition-m2r6p started at 2022-11-26 18:45:17 +0000 UTC (0+1 container statuses recorded) Nov 26 18:49:06.086: INFO: Container affinity-lb-esipp-transition ready: true, restart count 0 Nov 26 18:49:06.086: INFO: csi-hostpathplugin-0 started at 2022-11-26 18:46:57 +0000 UTC (0+7 container statuses recorded) Nov 26 18:49:06.086: INFO: Container csi-attacher ready: true, restart count 0 Nov 26 18:49:06.086: INFO: Container csi-provisioner ready: true, restart count 0 Nov 26 18:49:06.086: INFO: Container csi-resizer ready: true, restart count 0 Nov 26 18:49:06.086: INFO: Container csi-snapshotter ready: true, restart count 0 Nov 26 18:49:06.086: INFO: Container hostpath ready: true, restart count 0 Nov 26 18:49:06.086: INFO: Container liveness-probe ready: true, restart count 0 Nov 26 18:49:06.086: INFO: Container node-driver-registrar ready: true, restart count 0 Nov 26 18:49:06.401: INFO: Latency metrics for node bootstrap-e2e-minion-group-p1wq [DeferCleanup (Each)] [sig-network] LoadBalancers ESIPP [Slow] tear down framework | framework.go:193 STEP: Destroying namespace "esipp-5143" for this suite. 11/26/22 18:49:06.401
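The per-node dump above ("Logging kubelet events for node ..." / "Logging pods the kubelet thinks is on node ...") amounts to listing events and pods filtered by node name. A minimal client-go sketch of that kind of per-node dump, assuming a standard clientset and the kubeconfig path shown in the log; the helper and variable names are illustrative, not the e2e framework's own:

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

// dumpNode lists the events recorded against one node and the pods scheduled
// onto it, roughly the information logged per bootstrap-e2e-minion-group-* node.
func dumpNode(ctx context.Context, cs kubernetes.Interface, nodeName string) error {
	// Events whose involved object is the node itself.
	events, err := cs.CoreV1().Events("").List(ctx, metav1.ListOptions{
		FieldSelector: "involvedObject.kind=Node,involvedObject.name=" + nodeName,
	})
	if err != nil {
		return err
	}
	fmt.Printf("node %s: %d events\n", nodeName, len(events.Items))

	// Pods scheduled onto the node, across all namespaces.
	pods, err := cs.CoreV1().Pods("").List(ctx, metav1.ListOptions{
		FieldSelector: "spec.nodeName=" + nodeName,
	})
	if err != nil {
		return err
	}
	for _, p := range pods.Items {
		for _, s := range p.Status.ContainerStatuses {
			fmt.Printf("  %s/%s container %s ready=%v restarts=%d\n",
				p.Namespace, p.Name, s.Name, s.Ready, s.RestartCount)
		}
	}
	return nil
}

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/workspace/.kube/config")
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)
	if err := dumpNode(context.Background(), cs, "bootstrap-e2e-minion-group-dzls"); err != nil {
		panic(err)
	}
}

Field selectors keep the filtering server-side, so each node's dump is a pair of list calls rather than a full cluster scan.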
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-network\]\sLoadBalancers\sESIPP\s\[Slow\]\sshould\sonly\starget\snodes\swith\sendpoints$'
test/e2e/framework/service/util.go:48
k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTPWithRetriableErrorCodes({0xc002feb410, 0xe}, 0x1f91, {0xae73300, 0x0, 0x0}, 0xc000a45790?)
	test/e2e/framework/service/util.go:48 +0x265
k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTP(...)
	test/e2e/framework/service/util.go:29
k8s.io/kubernetes/test/e2e/network.glob..func20.5()
	test/e2e/network/loadbalancer.go:1404 +0x737
from junit_01.xml
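The failing frame is the reachability poll: the test repeatedly pokes the load balancer's /echo endpoint until it answers, retrying on dial failures and timeouts like the ones logged below, with the retry driven by the wait.PollImmediate chain visible in the goroutine dumps further down. A minimal sketch of that polling pattern, assuming only k8s.io/apimachinery's wait package and the standard library; the function name and timeouts are illustrative, not the framework's TestReachableHTTP implementation:

package main

import (
	"fmt"
	"io"
	"net/http"
	"time"

	"k8s.io/apimachinery/pkg/util/wait"
)

// pokeUntilReachable polls url every interval until it answers 200 within the
// per-request timeout, or until the overall deadline is exceeded. Connection
// refused, timeouts and non-200 codes are treated as retriable, mirroring the
// "Poke(...)" retry loop in the log below.
func pokeUntilReachable(url string, interval, perRequest, overall time.Duration) error {
	client := &http.Client{Timeout: perRequest}
	return wait.PollImmediate(interval, overall, func() (bool, error) {
		resp, err := client.Get(url)
		if err != nil {
			fmt.Printf("Poke(%q): %v\n", url, err) // retriable: connection refused, i/o timeout, ...
			return false, nil
		}
		defer resp.Body.Close()
		io.Copy(io.Discard, resp.Body)
		if resp.StatusCode != http.StatusOK {
			fmt.Printf("Poke(%q): status %d\n", url, resp.StatusCode)
			return false, nil
		}
		return true, nil
	})
}

func main() {
	// Address and port taken from the log entries below; timeouts are
	// illustrative, not the framework's defaults.
	err := pokeUntilReachable("http://34.168.232.208:8081/echo?msg=hello",
		2*time.Second, 10*time.Second, 5*time.Minute)
	fmt.Println("reachable:", err == nil, "err:", err)
}

Returning (false, nil) from the condition keeps the poll going on retriable failures; only a non-nil error aborts early, which is why the log below shows roughly two-second retries right up to the deadline.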
[BeforeEach] [sig-network] LoadBalancers ESIPP [Slow] set up framework | framework.go:178 STEP: Creating a kubernetes client 11/26/22 18:41:48.553 Nov 26 18:41:48.554: INFO: >>> kubeConfig: /workspace/.kube/config STEP: Building a namespace api object, basename esipp 11/26/22 18:41:48.555 STEP: Waiting for a default service account to be provisioned in namespace 11/26/22 18:41:48.684 STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 11/26/22 18:41:48.766 [BeforeEach] [sig-network] LoadBalancers ESIPP [Slow] test/e2e/framework/metrics/init/init.go:31 [BeforeEach] [sig-network] LoadBalancers ESIPP [Slow] test/e2e/network/loadbalancer.go:1250 [It] should only target nodes with endpoints test/e2e/network/loadbalancer.go:1346 STEP: creating a service esipp-5938/external-local-nodes with type=LoadBalancer 11/26/22 18:41:49.098 STEP: setting ExternalTrafficPolicy=Local 11/26/22 18:41:49.099 STEP: waiting for loadbalancer for service esipp-5938/external-local-nodes 11/26/22 18:41:49.332 Nov 26 18:41:49.332: INFO: Waiting up to 15m0s for service "external-local-nodes" to have a LoadBalancer STEP: waiting for loadbalancer for service esipp-5938/external-local-nodes 11/26/22 18:44:09.506 Nov 26 18:44:09.506: INFO: Waiting up to 15m0s for service "external-local-nodes" to have a LoadBalancer STEP: Performing setup for networking test in namespace esipp-5938 11/26/22 18:44:09.601 STEP: creating a selector 11/26/22 18:44:09.601 STEP: Creating the service pods in kubernetes 11/26/22 18:44:09.601 Nov 26 18:44:09.601: INFO: Waiting up to 10m0s for all (but 0) nodes to be schedulable Nov 26 18:44:09.938: INFO: Waiting up to 5m0s for pod "netserver-0" in namespace "esipp-5938" to be "running and ready" Nov 26 18:44:10.012: INFO: Pod "netserver-0": Phase="Pending", Reason="", readiness=false. Elapsed: 74.162527ms Nov 26 18:44:10.012: INFO: The phase of Pod netserver-0 is Pending, waiting for it to be Running (with Ready = true) Nov 26 18:44:12.142: INFO: Pod "netserver-0": Phase="Pending", Reason="", readiness=false. Elapsed: 2.204390932s Nov 26 18:44:12.142: INFO: The phase of Pod netserver-0 is Pending, waiting for it to be Running (with Ready = true) Nov 26 18:44:14.093: INFO: Pod "netserver-0": Phase="Pending", Reason="", readiness=false. Elapsed: 4.155089688s Nov 26 18:44:14.093: INFO: The phase of Pod netserver-0 is Pending, waiting for it to be Running (with Ready = true) Nov 26 18:44:16.096: INFO: Pod "netserver-0": Phase="Pending", Reason="", readiness=false. Elapsed: 6.158324438s Nov 26 18:44:16.096: INFO: The phase of Pod netserver-0 is Pending, waiting for it to be Running (with Ready = true) Nov 26 18:44:18.071: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 8.133673607s Nov 26 18:44:18.071: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:44:20.074: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 10.136753098s Nov 26 18:44:20.074: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:44:22.131: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 12.193266403s Nov 26 18:44:22.131: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:44:24.131: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 14.193354298s Nov 26 18:44:24.131: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:44:26.061: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. 
Elapsed: 16.123040995s Nov 26 18:44:26.061: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:44:28.107: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 18.169213327s Nov 26 18:44:28.107: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:44:30.061: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 20.123652834s Nov 26 18:44:30.061: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:44:32.121: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 22.183400553s Nov 26 18:44:32.121: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:44:34.064: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 24.126621937s Nov 26 18:44:34.064: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:44:36.085: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 26.147665592s Nov 26 18:44:36.085: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:44:38.108: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 28.170053772s Nov 26 18:44:38.108: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:44:40.069: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 30.130938863s Nov 26 18:44:40.069: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:44:42.179: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 32.241566036s Nov 26 18:44:42.179: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:44:44.094: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 34.156176556s Nov 26 18:44:44.094: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:44:46.127: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 36.189469056s Nov 26 18:44:46.127: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:44:48.119: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 38.18150277s Nov 26 18:44:48.119: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:44:50.074: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 40.136476006s Nov 26 18:44:50.074: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:44:52.127: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 42.189295456s Nov 26 18:44:52.127: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:44:54.073: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 44.135335446s Nov 26 18:44:54.073: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:44:56.101: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 46.163876182s Nov 26 18:44:56.102: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:44:58.073: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 48.135126375s Nov 26 18:44:58.073: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:45:00.064: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 50.126765343s Nov 26 18:45:00.064: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:45:02.088: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. 
Elapsed: 52.15073188s Nov 26 18:45:02.088: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:45:04.068: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 54.130693216s Nov 26 18:45:04.068: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:45:06.072: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 56.134855088s Nov 26 18:45:06.072: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:45:08.085: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 58.147847796s Nov 26 18:45:08.085: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:45:10.065: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=false. Elapsed: 1m0.126984841s Nov 26 18:45:10.065: INFO: The phase of Pod netserver-0 is Running (Ready = false) Nov 26 18:45:12.175: INFO: Pod "netserver-0": Phase="Running", Reason="", readiness=true. Elapsed: 1m2.236977478s Nov 26 18:45:12.175: INFO: The phase of Pod netserver-0 is Running (Ready = true) Nov 26 18:45:12.175: INFO: Pod "netserver-0" satisfied condition "running and ready" Nov 26 18:45:12.258: INFO: Waiting up to 5m0s for pod "netserver-1" in namespace "esipp-5938" to be "running and ready" Nov 26 18:45:12.310: INFO: Pod "netserver-1": Phase="Running", Reason="", readiness=true. Elapsed: 51.635645ms Nov 26 18:45:12.310: INFO: The phase of Pod netserver-1 is Running (Ready = true) Nov 26 18:45:12.310: INFO: Pod "netserver-1" satisfied condition "running and ready" Nov 26 18:45:12.374: INFO: Waiting up to 5m0s for pod "netserver-2" in namespace "esipp-5938" to be "running and ready" Nov 26 18:45:12.447: INFO: Pod "netserver-2": Phase="Running", Reason="", readiness=false. Elapsed: 73.05442ms Nov 26 18:45:12.447: INFO: The phase of Pod netserver-2 is Running (Ready = false) Nov 26 18:45:14.508: INFO: Pod "netserver-2": Phase="Running", Reason="", readiness=false. Elapsed: 2.133969233s Nov 26 18:45:14.508: INFO: The phase of Pod netserver-2 is Running (Ready = false) Nov 26 18:45:16.514: INFO: Pod "netserver-2": Phase="Running", Reason="", readiness=false. Elapsed: 4.139997634s Nov 26 18:45:16.514: INFO: The phase of Pod netserver-2 is Running (Ready = false) Nov 26 18:45:18.511: INFO: Pod "netserver-2": Phase="Running", Reason="", readiness=false. Elapsed: 6.137741035s Nov 26 18:45:18.512: INFO: The phase of Pod netserver-2 is Running (Ready = false) Nov 26 18:45:20.502: INFO: Pod "netserver-2": Phase="Running", Reason="", readiness=false. Elapsed: 8.12776872s Nov 26 18:45:20.502: INFO: The phase of Pod netserver-2 is Running (Ready = false) Nov 26 18:45:22.496: INFO: Pod "netserver-2": Phase="Running", Reason="", readiness=true. Elapsed: 10.121942147s Nov 26 18:45:22.496: INFO: The phase of Pod netserver-2 is Running (Ready = true) Nov 26 18:45:22.496: INFO: Pod "netserver-2" satisfied condition "running and ready" STEP: Creating test pods 11/26/22 18:45:22.559 Nov 26 18:45:22.732: INFO: Waiting up to 5m0s for pod "test-container-pod" in namespace "esipp-5938" to be "running" Nov 26 18:45:22.818: INFO: Pod "test-container-pod": Phase="Pending", Reason="", readiness=false. Elapsed: 85.585294ms Nov 26 18:45:24.891: INFO: Pod "test-container-pod": Phase="Running", Reason="", readiness=true. 
Elapsed: 2.158472917s Nov 26 18:45:24.891: INFO: Pod "test-container-pod" satisfied condition "running" Nov 26 18:45:24.938: INFO: Setting MaxTries for pod polling to 39 for networking test based on endpoint count 3 STEP: Getting node addresses 11/26/22 18:45:24.938 Nov 26 18:45:24.938: INFO: Waiting up to 10m0s for all (but 0) nodes to be schedulable STEP: Creating the service on top of the pods in kubernetes 11/26/22 18:45:25.061 Nov 26 18:45:25.222: INFO: Service node-port-service in namespace esipp-5938 found. Nov 26 18:45:25.445: INFO: Service session-affinity-service in namespace esipp-5938 found. STEP: Waiting for NodePort service to expose endpoint 11/26/22 18:45:25.513 Nov 26 18:45:26.513: INFO: Waiting for amount of service:node-port-service endpoints to be 3 STEP: Waiting for Session Affinity service to expose endpoint 11/26/22 18:45:26.612 Nov 26 18:45:27.613: INFO: Waiting for amount of service:session-affinity-service endpoints to be 3 STEP: creating a pod to be part of the service external-local-nodes on node bootstrap-e2e-minion-group-dzls 11/26/22 18:45:27.681 Nov 26 18:45:27.737: INFO: Waiting up to 2m0s for 1 pods to be created Nov 26 18:45:27.813: INFO: Found all 1 pods Nov 26 18:45:27.813: INFO: Waiting up to 2m0s for 1 pods to be running and ready: [external-local-nodes-z9qgt] Nov 26 18:45:27.813: INFO: Waiting up to 2m0s for pod "external-local-nodes-z9qgt" in namespace "esipp-5938" to be "running and ready" Nov 26 18:45:27.890: INFO: Pod "external-local-nodes-z9qgt": Phase="Pending", Reason="", readiness=false. Elapsed: 77.192244ms Nov 26 18:45:27.891: INFO: Error evaluating pod condition running and ready: want pod 'external-local-nodes-z9qgt' on 'bootstrap-e2e-minion-group-dzls' to be 'Running' but was 'Pending' Nov 26 18:45:29.955: INFO: Pod "external-local-nodes-z9qgt": Phase="Running", Reason="", readiness=true. Elapsed: 2.141829279s Nov 26 18:45:29.955: INFO: Pod "external-local-nodes-z9qgt" satisfied condition "running and ready" Nov 26 18:45:29.955: INFO: Wanted all 1 pods to be running and ready. Result: true. 
Pods: [external-local-nodes-z9qgt] STEP: waiting for service endpoint on node bootstrap-e2e-minion-group-dzls 11/26/22 18:45:29.955 Nov 26 18:45:30.060: INFO: Pod for service esipp-5938/external-local-nodes is on node bootstrap-e2e-minion-group-dzls Nov 26 18:45:30.060: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:45:30.101: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:45:32.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:45:32.141: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:45:34.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:45:34.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:45:36.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:45:36.141: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:45:38.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:45:38.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:45:40.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:45:40.141: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:45:42.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:45:42.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:45:44.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:45:44.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:45:46.101: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:45:46.141: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:45:48.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:45:58.103: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Nov 26 18:46:00.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:46:00.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:46:02.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:46:12.102: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: i/o timeout (Client.Timeout exceeded while awaiting headers) Nov 26 18:46:12.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:46:22.103: 
INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Nov 26 18:46:24.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:46:24.141: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:46:26.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:46:26.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:46:28.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:46:28.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:46:30.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:46:30.141: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:46:32.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:46:32.141: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:46:34.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:46:34.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:46:36.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:46:36.141: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:46:38.101: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:46:38.141: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:46:40.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" ------------------------------ Progress Report for Ginkgo Process #22 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should only target nodes with endpoints (Spec Runtime: 5m0.443s) test/e2e/network/loadbalancer.go:1346 In [It] (Node Runtime: 5m0s) test/e2e/network/loadbalancer.go:1346 At [By Step] waiting for service endpoint on node bootstrap-e2e-minion-group-dzls (Step Runtime: 1m19.041s) test/e2e/network/loadbalancer.go:1395 Spec Goroutine goroutine 244 [select] net/http.(*Transport).getConn(0xc002bbf900, 0xc003082300, {{}, 0x0, {0xc003442d20, 0x4}, {0xc000547d28, 0x13}, 0x0}) /usr/local/go/src/net/http/transport.go:1376 net/http.(*Transport).roundTrip(0xc002bbf900, 0xc003465400) /usr/local/go/src/net/http/transport.go:582 net/http.(*Transport).RoundTrip(0xc003465400?, 0x7fadc80?) 
/usr/local/go/src/net/http/roundtrip.go:17 net/http.send(0xc003465200, {0x7fadc80, 0xc002bbf900}, {0x74d54e0?, 0x26b3a01?, 0xae40400?}) /usr/local/go/src/net/http/client.go:251 net/http.(*Client).send(0xc0019c66c0, 0xc003465200, {0x0?, 0x262a61f?, 0xae40400?}) /usr/local/go/src/net/http/client.go:175 net/http.(*Client).do(0xc0019c66c0, 0xc003465200) /usr/local/go/src/net/http/client.go:715 net/http.(*Client).Do(...) /usr/local/go/src/net/http/client.go:581 net/http.(*Client).Get(0x2?, {0xc003442d20?, 0x9?}) /usr/local/go/src/net/http/client.go:479 k8s.io/kubernetes/test/e2e/framework/network.httpGetNoConnectionPoolTimeout({0xc003442d20, 0x29}, 0x2540be400) test/e2e/framework/network/utils.go:1065 k8s.io/kubernetes/test/e2e/framework/network.PokeHTTP({0xc002feb410, 0xe}, 0x1f91, {0x75ddb6b, 0xf}, 0xc000a2bb84?) test/e2e/framework/network/utils.go:998 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTPWithRetriableErrorCodes.func1() test/e2e/framework/service/util.go:35 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.ConditionFunc.WithContext.func1({0x2742871, 0x0}) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:222 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.runConditionWithCrashProtectionWithContext({0x7fe0bc8?, 0xc0000820c8?}, 0x5?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:235 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0000820c8}, 0xc000546a80, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:662 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0000820c8}, 0x50?, 0x2fd9d05?, 0x28?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0000820c8}, 0x2fdaaaa?, 0xc000a2bca0?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x2fdaa30?, 0x7fe0bc8?, 0xc0000820c8?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTPWithRetriableErrorCodes({0xc002feb410, 0xe}, 0x1f91, {0xae73300, 0x0, 0x0}, 0xc000a45790?) test/e2e/framework/service/util.go:46 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTP(...) 
test/e2e/framework/service/util.go:29 > k8s.io/kubernetes/test/e2e/network.glob..func20.5() test/e2e/network/loadbalancer.go:1404 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0000cdfa0, 0x2fd9502}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:46:50.102: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Nov 26 18:46:52.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:47:02.103: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Nov 26 18:47:04.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:47:04.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:47:06.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:47:06.141: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:47:08.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:47:08.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #22 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should only target nodes with endpoints (Spec Runtime: 5m20.446s) test/e2e/network/loadbalancer.go:1346 In [It] (Node Runtime: 5m20.003s) test/e2e/network/loadbalancer.go:1346 At [By Step] waiting for service endpoint on node bootstrap-e2e-minion-group-dzls (Step Runtime: 1m39.044s) test/e2e/network/loadbalancer.go:1395 Spec Goroutine goroutine 244 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0000820c8}, 0xc000546a80, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0000820c8}, 0x50?, 0x2fd9d05?, 0x28?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0000820c8}, 0x2fdaaaa?, 0xc000a2bca0?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x2fdaa30?, 0x7fe0bc8?, 0xc0000820c8?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTPWithRetriableErrorCodes({0xc002feb410, 0xe}, 0x1f91, {0xae73300, 0x0, 0x0}, 0xc000a45790?) test/e2e/framework/service/util.go:46 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTP(...) 
test/e2e/framework/service/util.go:29 > k8s.io/kubernetes/test/e2e/network.glob..func20.5() test/e2e/network/loadbalancer.go:1404 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0000cdfa0, 0x2fd9502}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:47:10.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:47:20.103: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Nov 26 18:47:22.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:47:22.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:47:24.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:47:24.141: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:47:26.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:47:26.141: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:47:28.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:47:28.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #22 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should only target nodes with endpoints (Spec Runtime: 5m40.448s) test/e2e/network/loadbalancer.go:1346 In [It] (Node Runtime: 5m40.006s) test/e2e/network/loadbalancer.go:1346 At [By Step] waiting for service endpoint on node bootstrap-e2e-minion-group-dzls (Step Runtime: 1m59.046s) test/e2e/network/loadbalancer.go:1395 Spec Goroutine goroutine 244 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0000820c8}, 0xc000546a80, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0000820c8}, 0x50?, 0x2fd9d05?, 0x28?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0000820c8}, 0x2fdaaaa?, 0xc000a2bca0?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x2fdaa30?, 0x7fe0bc8?, 0xc0000820c8?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTPWithRetriableErrorCodes({0xc002feb410, 0xe}, 0x1f91, {0xae73300, 0x0, 0x0}, 0xc000a45790?) test/e2e/framework/service/util.go:46 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTP(...) 
test/e2e/framework/service/util.go:29 > k8s.io/kubernetes/test/e2e/network.glob..func20.5() test/e2e/network/loadbalancer.go:1404 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0000cdfa0, 0x2fd9502}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:47:30.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:47:40.103: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Nov 26 18:47:42.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:47:42.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:47:44.101: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:47:44.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:47:46.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:47:46.141: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:47:48.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:47:48.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #22 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should only target nodes with endpoints (Spec Runtime: 6m0.452s) test/e2e/network/loadbalancer.go:1346 In [It] (Node Runtime: 6m0.009s) test/e2e/network/loadbalancer.go:1346 At [By Step] waiting for service endpoint on node bootstrap-e2e-minion-group-dzls (Step Runtime: 2m19.05s) test/e2e/network/loadbalancer.go:1395 Spec Goroutine goroutine 244 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0000820c8}, 0xc000546a80, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0000820c8}, 0x50?, 0x2fd9d05?, 0x28?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0000820c8}, 0x2fdaaaa?, 0xc000a2bca0?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x2fdaa30?, 0x7fe0bc8?, 0xc0000820c8?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTPWithRetriableErrorCodes({0xc002feb410, 0xe}, 0x1f91, {0xae73300, 0x0, 0x0}, 0xc000a45790?) test/e2e/framework/service/util.go:46 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTP(...) 
test/e2e/framework/service/util.go:29 > k8s.io/kubernetes/test/e2e/network.glob..func20.5() test/e2e/network/loadbalancer.go:1404 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0000cdfa0, 0x2fd9502}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:47:50.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:47:50.141: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:47:52.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:47:52.141: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:47:54.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:48:04.102: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Nov 26 18:48:06.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:48:06.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:48:08.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:48:08.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #22 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should only target nodes with endpoints (Spec Runtime: 6m20.454s) test/e2e/network/loadbalancer.go:1346 In [It] (Node Runtime: 6m20.011s) test/e2e/network/loadbalancer.go:1346 At [By Step] waiting for service endpoint on node bootstrap-e2e-minion-group-dzls (Step Runtime: 2m39.052s) test/e2e/network/loadbalancer.go:1395 Spec Goroutine goroutine 244 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0000820c8}, 0xc000546a80, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0000820c8}, 0x50?, 0x2fd9d05?, 0x28?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0000820c8}, 0x2fdaaaa?, 0xc000a2bca0?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x2fdaa30?, 0x7fe0bc8?, 0xc0000820c8?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTPWithRetriableErrorCodes({0xc002feb410, 0xe}, 0x1f91, {0xae73300, 0x0, 0x0}, 0xc000a45790?) test/e2e/framework/service/util.go:46 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTP(...) 
test/e2e/framework/service/util.go:29 > k8s.io/kubernetes/test/e2e/network.glob..func20.5() test/e2e/network/loadbalancer.go:1404 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0000cdfa0, 0x2fd9502}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:48:10.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:48:20.103: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Nov 26 18:48:22.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:48:22.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:48:24.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:48:24.141: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:48:26.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:48:26.141: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:48:28.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:48:28.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #22 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should only target nodes with endpoints (Spec Runtime: 6m40.456s) test/e2e/network/loadbalancer.go:1346 In [It] (Node Runtime: 6m40.014s) test/e2e/network/loadbalancer.go:1346 At [By Step] waiting for service endpoint on node bootstrap-e2e-minion-group-dzls (Step Runtime: 2m59.054s) test/e2e/network/loadbalancer.go:1395 Spec Goroutine goroutine 244 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0000820c8}, 0xc000546a80, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0000820c8}, 0x50?, 0x2fd9d05?, 0x28?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0000820c8}, 0x2fdaaaa?, 0xc000a2bca0?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x2fdaa30?, 0x7fe0bc8?, 0xc0000820c8?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTPWithRetriableErrorCodes({0xc002feb410, 0xe}, 0x1f91, {0xae73300, 0x0, 0x0}, 0xc000a45790?) test/e2e/framework/service/util.go:46 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTP(...) 
test/e2e/framework/service/util.go:29 > k8s.io/kubernetes/test/e2e/network.glob..func20.5() test/e2e/network/loadbalancer.go:1404 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0000cdfa0, 0x2fd9502}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:48:30.101: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:48:30.141: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:48:32.101: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:48:32.141: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:48:34.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:48:44.103: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Nov 26 18:48:46.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:48:46.143: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:48:48.101: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:48:48.141: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #22 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should only target nodes with endpoints (Spec Runtime: 7m0.459s) test/e2e/network/loadbalancer.go:1346 In [It] (Node Runtime: 7m0.017s) test/e2e/network/loadbalancer.go:1346 At [By Step] waiting for service endpoint on node bootstrap-e2e-minion-group-dzls (Step Runtime: 3m19.057s) test/e2e/network/loadbalancer.go:1395 Spec Goroutine goroutine 244 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0000820c8}, 0xc000546a80, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0000820c8}, 0x50?, 0x2fd9d05?, 0x28?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0000820c8}, 0x2fdaaaa?, 0xc000a2bca0?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x2fdaa30?, 0x7fe0bc8?, 0xc0000820c8?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTPWithRetriableErrorCodes({0xc002feb410, 0xe}, 0x1f91, {0xae73300, 0x0, 0x0}, 0xc000a45790?) test/e2e/framework/service/util.go:46 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTP(...) 
test/e2e/framework/service/util.go:29 > k8s.io/kubernetes/test/e2e/network.glob..func20.5() test/e2e/network/loadbalancer.go:1404 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0000cdfa0, 0x2fd9502}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:48:50.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:48:50.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:48:52.101: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:48:52.141: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:48:54.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:48:54.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:48:56.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:49:06.103: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Nov 26 18:49:08.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:49:08.143: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #22 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should only target nodes with endpoints (Spec Runtime: 7m20.463s) test/e2e/network/loadbalancer.go:1346 In [It] (Node Runtime: 7m20.021s) test/e2e/network/loadbalancer.go:1346 At [By Step] waiting for service endpoint on node bootstrap-e2e-minion-group-dzls (Step Runtime: 3m39.062s) test/e2e/network/loadbalancer.go:1395 Spec Goroutine goroutine 244 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0000820c8}, 0xc000546a80, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0000820c8}, 0x50?, 0x2fd9d05?, 0x28?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0000820c8}, 0x2fdaaaa?, 0xc000a2bca0?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x2fdaa30?, 0x7fe0bc8?, 0xc0000820c8?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTPWithRetriableErrorCodes({0xc002feb410, 0xe}, 0x1f91, {0xae73300, 0x0, 0x0}, 0xc000a45790?) test/e2e/framework/service/util.go:46 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTP(...) 
test/e2e/framework/service/util.go:29 > k8s.io/kubernetes/test/e2e/network.glob..func20.5() test/e2e/network/loadbalancer.go:1404 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0000cdfa0, 0x2fd9502}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:49:10.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:49:20.103: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Nov 26 18:49:22.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:49:22.143: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:49:24.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:49:24.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:49:26.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" ------------------------------ Progress Report for Ginkgo Process #22 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should only target nodes with endpoints (Spec Runtime: 7m40.465s) test/e2e/network/loadbalancer.go:1346 In [It] (Node Runtime: 7m40.023s) test/e2e/network/loadbalancer.go:1346 At [By Step] waiting for service endpoint on node bootstrap-e2e-minion-group-dzls (Step Runtime: 3m59.063s) test/e2e/network/loadbalancer.go:1395 Spec Goroutine goroutine 244 [select] net/http.(*Transport).getConn(0xc000ad3b80, 0xc003082040, {{}, 0x0, {0xc0031d20f0, 0x4}, {0xc0013ea2d0, 0x13}, 0x0}) /usr/local/go/src/net/http/transport.go:1376 net/http.(*Transport).roundTrip(0xc000ad3b80, 0xc004058100) /usr/local/go/src/net/http/transport.go:582 net/http.(*Transport).RoundTrip(0xc004058100?, 0x7fadc80?) /usr/local/go/src/net/http/roundtrip.go:17 net/http.send(0xc004058000, {0x7fadc80, 0xc000ad3b80}, {0x74d54e0?, 0x26b3a01?, 0xae40400?}) /usr/local/go/src/net/http/client.go:251 net/http.(*Client).send(0xc0019c66c0, 0xc004058000, {0x0?, 0xc000a2b500?, 0xae40400?}) /usr/local/go/src/net/http/client.go:175 net/http.(*Client).do(0xc0019c66c0, 0xc004058000) /usr/local/go/src/net/http/client.go:715 net/http.(*Client).Do(...) /usr/local/go/src/net/http/client.go:581 net/http.(*Client).Get(0x2?, {0xc0031d20f0?, 0x9?}) /usr/local/go/src/net/http/client.go:479 k8s.io/kubernetes/test/e2e/framework/network.httpGetNoConnectionPoolTimeout({0xc0031d20f0, 0x29}, 0x2540be400) test/e2e/framework/network/utils.go:1065 k8s.io/kubernetes/test/e2e/framework/network.PokeHTTP({0xc002feb410, 0xe}, 0x1f91, {0x75ddb6b, 0xf}, 0xc000a2bb84?) 
test/e2e/framework/network/utils.go:998 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTPWithRetriableErrorCodes.func1() test/e2e/framework/service/util.go:35 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.ConditionFunc.WithContext.func1({0x2742871, 0x0}) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:222 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.runConditionWithCrashProtectionWithContext({0x7fe0bc8?, 0xc0000820c8?}, 0x5?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:235 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0000820c8}, 0xc000546a80, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:662 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0000820c8}, 0x50?, 0x2fd9d05?, 0x28?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0000820c8}, 0x2fdaaaa?, 0xc000a2bca0?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x2fdaa30?, 0x7fe0bc8?, 0xc0000820c8?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTPWithRetriableErrorCodes({0xc002feb410, 0xe}, 0x1f91, {0xae73300, 0x0, 0x0}, 0xc000a45790?) test/e2e/framework/service/util.go:46 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTP(...) test/e2e/framework/service/util.go:29 > k8s.io/kubernetes/test/e2e/network.glob..func20.5() test/e2e/network/loadbalancer.go:1404 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0000cdfa0, 0x2fd9502}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:49:36.103: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Nov 26 18:49:38.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:49:38.143: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:49:40.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" ------------------------------ Progress Report for Ginkgo Process #22 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should only target nodes with endpoints (Spec Runtime: 8m0.467s) test/e2e/network/loadbalancer.go:1346 In [It] (Node Runtime: 8m0.025s) test/e2e/network/loadbalancer.go:1346 At [By Step] waiting for service endpoint on node bootstrap-e2e-minion-group-dzls (Step Runtime: 4m19.065s) test/e2e/network/loadbalancer.go:1395 Spec Goroutine goroutine 244 [select] net/http.(*Transport).getConn(0xc000830dc0, 0xc000a5cd80, {{}, 0x0, {0xc003692d80, 0x4}, {0xc0031e2480, 0x13}, 0x0}) /usr/local/go/src/net/http/transport.go:1376 net/http.(*Transport).roundTrip(0xc000830dc0, 0xc001896b00) /usr/local/go/src/net/http/transport.go:582 net/http.(*Transport).RoundTrip(0xc001896b00?, 0x7fadc80?) 
/usr/local/go/src/net/http/roundtrip.go:17 net/http.send(0xc001896a00, {0x7fadc80, 0xc000830dc0}, {0x74d54e0?, 0x26b3a01?, 0xae40400?}) /usr/local/go/src/net/http/client.go:251 net/http.(*Client).send(0xc0001728a0, 0xc001896a00, {0x0?, 0x262a61f?, 0xae40400?}) /usr/local/go/src/net/http/client.go:175 net/http.(*Client).do(0xc0001728a0, 0xc001896a00) /usr/local/go/src/net/http/client.go:715 net/http.(*Client).Do(...) /usr/local/go/src/net/http/client.go:581 net/http.(*Client).Get(0x2?, {0xc003692d80?, 0x9?}) /usr/local/go/src/net/http/client.go:479 k8s.io/kubernetes/test/e2e/framework/network.httpGetNoConnectionPoolTimeout({0xc003692d80, 0x29}, 0x2540be400) test/e2e/framework/network/utils.go:1065 k8s.io/kubernetes/test/e2e/framework/network.PokeHTTP({0xc002feb410, 0xe}, 0x1f91, {0x75ddb6b, 0xf}, 0xc000a2bb84?) test/e2e/framework/network/utils.go:998 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTPWithRetriableErrorCodes.func1() test/e2e/framework/service/util.go:35 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.ConditionFunc.WithContext.func1({0x2742871, 0x0}) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:222 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.runConditionWithCrashProtectionWithContext({0x7fe0bc8?, 0xc0000820c8?}, 0x5?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:235 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0000820c8}, 0xc000546a80, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:662 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0000820c8}, 0x50?, 0x2fd9d05?, 0x28?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0000820c8}, 0x2fdaaaa?, 0xc000a2bca0?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x2fdaa30?, 0x7fe0bc8?, 0xc0000820c8?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTPWithRetriableErrorCodes({0xc002feb410, 0xe}, 0x1f91, {0xae73300, 0x0, 0x0}, 0xc000a45790?) test/e2e/framework/service/util.go:46 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTP(...) 
test/e2e/framework/service/util.go:29 > k8s.io/kubernetes/test/e2e/network.glob..func20.5() test/e2e/network/loadbalancer.go:1404 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0000cdfa0, 0x2fd9502}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:49:50.103: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Nov 26 18:49:52.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:49:52.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:49:54.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:49:54.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:49:56.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:49:56.141: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:49:58.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:50:08.102: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": context deadline exceeded (Client.Timeout exceeded while awaiting headers) ------------------------------ Progress Report for Ginkgo Process #22 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should only target nodes with endpoints (Spec Runtime: 8m20.469s) test/e2e/network/loadbalancer.go:1346 In [It] (Node Runtime: 8m20.027s) test/e2e/network/loadbalancer.go:1346 At [By Step] waiting for service endpoint on node bootstrap-e2e-minion-group-dzls (Step Runtime: 4m39.068s) test/e2e/network/loadbalancer.go:1395 Spec Goroutine goroutine 244 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0000820c8}, 0xc000546a80, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0000820c8}, 0x50?, 0x2fd9d05?, 0x28?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0000820c8}, 0x2fdaaaa?, 0xc000a2bca0?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x2fdaa30?, 0x7fe0bc8?, 0xc0000820c8?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTPWithRetriableErrorCodes({0xc002feb410, 0xe}, 0x1f91, {0xae73300, 0x0, 0x0}, 0xc000a45790?) test/e2e/framework/service/util.go:46 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTP(...) 
test/e2e/framework/service/util.go:29 > k8s.io/kubernetes/test/e2e/network.glob..func20.5() test/e2e/network/loadbalancer.go:1404 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0000cdfa0, 0x2fd9502}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:50:10.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:50:10.143: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:50:12.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:50:12.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:50:14.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:50:24.103: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Nov 26 18:50:26.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:50:26.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:50:28.102: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:50:28.142: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #22 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should only target nodes with endpoints (Spec Runtime: 8m40.471s) test/e2e/network/loadbalancer.go:1346 In [It] (Node Runtime: 8m40.029s) test/e2e/network/loadbalancer.go:1346 At [By Step] waiting for service endpoint on node bootstrap-e2e-minion-group-dzls (Step Runtime: 4m59.069s) test/e2e/network/loadbalancer.go:1395 Spec Goroutine goroutine 244 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0000820c8}, 0xc000546a80, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0000820c8}, 0x50?, 0x2fd9d05?, 0x28?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0000820c8}, 0x2fdaaaa?, 0xc000a2bca0?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x2fdaa30?, 0x7fe0bc8?, 0xc0000820c8?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTPWithRetriableErrorCodes({0xc002feb410, 0xe}, 0x1f91, {0xae73300, 0x0, 0x0}, 0xc000a45790?) test/e2e/framework/service/util.go:46 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTP(...) 
test/e2e/framework/service/util.go:29 > k8s.io/kubernetes/test/e2e/network.glob..func20.5() test/e2e/network/loadbalancer.go:1404 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0000cdfa0, 0x2fd9502}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:50:30.101: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:50:30.141: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:50:30.141: INFO: Poking "http://34.168.232.208:8081/echo?msg=hello" Nov 26 18:50:30.181: INFO: Poke("http://34.168.232.208:8081/echo?msg=hello"): Get "http://34.168.232.208:8081/echo?msg=hello": dial tcp 34.168.232.208:8081: connect: connection refused Nov 26 18:50:30.181: FAIL: Could not reach HTTP service through 34.168.232.208:8081 after 5m0s Full Stack Trace k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTPWithRetriableErrorCodes({0xc002feb410, 0xe}, 0x1f91, {0xae73300, 0x0, 0x0}, 0xc000a45790?) test/e2e/framework/service/util.go:48 +0x265 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTP(...) test/e2e/framework/service/util.go:29 k8s.io/kubernetes/test/e2e/network.glob..func20.5() test/e2e/network/loadbalancer.go:1404 +0x737 Nov 26 18:50:30.359: INFO: Waiting up to 15m0s for service "external-local-nodes" to have no LoadBalancer [AfterEach] [sig-network] LoadBalancers ESIPP [Slow] test/e2e/framework/node/init/init.go:32 Nov 26 18:50:40.613: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready [AfterEach] [sig-network] LoadBalancers ESIPP [Slow] test/e2e/network/loadbalancer.go:1260 Nov 26 18:50:40.697: INFO: Output of kubectl describe svc: Nov 26 18:50:40.697: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=esipp-5938 describe svc --namespace=esipp-5938' Nov 26 18:50:41.293: INFO: stderr: "" Nov 26 18:50:41.293: INFO: stdout: "Name: external-local-nodes\nNamespace: esipp-5938\nLabels: testid=external-local-nodes-e30424ee-0c42-4bb8-8ccc-7c15aba51e13\nAnnotations: <none>\nSelector: testid=external-local-nodes-e30424ee-0c42-4bb8-8ccc-7c15aba51e13\nType: ClusterIP\nIP Family Policy: SingleStack\nIP Families: IPv4\nIP: 10.0.141.61\nIPs: 10.0.141.61\nPort: <unset> 8081/TCP\nTargetPort: 80/TCP\nEndpoints: \nSession Affinity: None\nEvents:\n Type Reason Age From Message\n ---- ------ ---- ---- -------\n Normal EnsuringLoadBalancer 7m9s service-controller Ensuring load balancer\n Normal EnsuredLoadBalancer 6m32s service-controller Ensured load balancer\n\n\nName: node-port-service\nNamespace: esipp-5938\nLabels: <none>\nAnnotations: <none>\nSelector: selector-197e9133-fc17-4f95-be68-4f42f2bbed05=true\nType: NodePort\nIP Family Policy: SingleStack\nIP Families: IPv4\nIP: 10.0.109.253\nIPs: 10.0.109.253\nPort: http 80/TCP\nTargetPort: 8083/TCP\nNodePort: http 32127/TCP\nEndpoints: 10.64.1.63:8083,10.64.3.112:8083\nPort: udp 90/UDP\nTargetPort: 8081/UDP\nNodePort: udp 31576/UDP\nEndpoints: 10.64.1.63:8081,10.64.3.112:8081\nSession Affinity: None\nExternal Traffic Policy: 
Cluster\nEvents: <none>\n\n\nName: session-affinity-service\nNamespace: esipp-5938\nLabels: <none>\nAnnotations: <none>\nSelector: selector-197e9133-fc17-4f95-be68-4f42f2bbed05=true\nType: NodePort\nIP Family Policy: SingleStack\nIP Families: IPv4\nIP: 10.0.21.18\nIPs: 10.0.21.18\nPort: http 80/TCP\nTargetPort: 8083/TCP\nNodePort: http 30156/TCP\nEndpoints: 10.64.1.63:8083,10.64.3.112:8083\nPort: udp 90/UDP\nTargetPort: 8081/UDP\nNodePort: udp 30414/UDP\nEndpoints: 10.64.1.63:8081,10.64.3.112:8081\nSession Affinity: ClientIP\nExternal Traffic Policy: Cluster\nEvents: <none>\n" Nov 26 18:50:41.293: INFO: Name: external-local-nodes Namespace: esipp-5938 Labels: testid=external-local-nodes-e30424ee-0c42-4bb8-8ccc-7c15aba51e13 Annotations: <none> Selector: testid=external-local-nodes-e30424ee-0c42-4bb8-8ccc-7c15aba51e13 Type: ClusterIP IP Family Policy: SingleStack IP Families: IPv4 IP: 10.0.141.61 IPs: 10.0.141.61 Port: <unset> 8081/TCP TargetPort: 80/TCP Endpoints: Session Affinity: None Events: Type Reason Age From Message ---- ------ ---- ---- ------- Normal EnsuringLoadBalancer 7m9s service-controller Ensuring load balancer Normal EnsuredLoadBalancer 6m32s service-controller Ensured load balancer Name: node-port-service Namespace: esipp-5938 Labels: <none> Annotations: <none> Selector: selector-197e9133-fc17-4f95-be68-4f42f2bbed05=true Type: NodePort IP Family Policy: SingleStack IP Families: IPv4 IP: 10.0.109.253 IPs: 10.0.109.253 Port: http 80/TCP TargetPort: 8083/TCP NodePort: http 32127/TCP Endpoints: 10.64.1.63:8083,10.64.3.112:8083 Port: udp 90/UDP TargetPort: 8081/UDP NodePort: udp 31576/UDP Endpoints: 10.64.1.63:8081,10.64.3.112:8081 Session Affinity: None External Traffic Policy: Cluster Events: <none> Name: session-affinity-service Namespace: esipp-5938 Labels: <none> Annotations: <none> Selector: selector-197e9133-fc17-4f95-be68-4f42f2bbed05=true Type: NodePort IP Family Policy: SingleStack IP Families: IPv4 IP: 10.0.21.18 IPs: 10.0.21.18 Port: http 80/TCP TargetPort: 8083/TCP NodePort: http 30156/TCP Endpoints: 10.64.1.63:8083,10.64.3.112:8083 Port: udp 90/UDP TargetPort: 8081/UDP NodePort: udp 30414/UDP Endpoints: 10.64.1.63:8081,10.64.3.112:8081 Session Affinity: ClientIP External Traffic Policy: Cluster Events: <none> [DeferCleanup (Each)] [sig-network] LoadBalancers ESIPP [Slow] test/e2e/framework/metrics/init/init.go:33 [DeferCleanup (Each)] [sig-network] LoadBalancers ESIPP [Slow] dump namespaces | framework.go:196 STEP: dump namespace information after failure 11/26/22 18:50:41.293 STEP: Collecting events from namespace "esipp-5938". 11/26/22 18:50:41.293 STEP: Found 36 events. 
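
The repeated Poking/Poke lines and the recurring goroutine dumps above are all one retry loop: PokeHTTP issues a GET with a 10 second per-request timeout (hence the 10 second gap before each "context deadline exceeded"), and TestReachableHTTPWithRetriableErrorCodes keeps calling it through wait.PollImmediate until the 5 minute budget runs out, which produced the 18:50:30 failure "Could not reach HTTP service through 34.168.232.208:8081 after 5m0s". A minimal, self-contained sketch of that pattern follows; the function name, the 2 second interval, and the error handling are illustrative assumptions rather than the framework's exact code in test/e2e/framework/service/util.go.

// Simplified sketch of the poll-until-reachable pattern visible in the log above.
// This approximates, and does not reproduce, the helper in test/e2e/framework/service/util.go.
package main

import (
	"fmt"
	"net/http"
	"time"

	"k8s.io/apimachinery/pkg/util/wait"
)

// waitForHTTP polls url until it returns 200 OK or the overall timeout expires.
// Connection refused and client timeouts are treated as retriable, matching the
// "connect: connection refused" and "context deadline exceeded" lines above.
func waitForHTTP(url string, perRequest, interval, timeout time.Duration) error {
	client := &http.Client{Timeout: perRequest}
	return wait.PollImmediate(interval, timeout, func() (bool, error) {
		fmt.Printf("Poking %q\n", url)
		resp, err := client.Get(url)
		if err != nil {
			fmt.Printf("Poke(%q): %v\n", url, err)
			return false, nil // retriable failure, keep polling
		}
		defer resp.Body.Close()
		return resp.StatusCode == http.StatusOK, nil
	})
}

func main() {
	// Mirrors the failed wait: 10s per request, ~2s between attempts, 5m total.
	err := waitForHTTP("http://34.168.232.208:8081/echo?msg=hello",
		10*time.Second, 2*time.Second, 5*time.Minute)
	fmt.Println("result:", err)
}
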
11/26/22 18:50:41.336 Nov 26 18:50:41.336: INFO: At 2022-11-26 18:43:32 +0000 UTC - event for external-local-nodes: {service-controller } EnsuringLoadBalancer: Ensuring load balancer Nov 26 18:50:41.336: INFO: At 2022-11-26 18:44:09 +0000 UTC - event for external-local-nodes: {service-controller } EnsuredLoadBalancer: Ensured load balancer Nov 26 18:50:41.336: INFO: At 2022-11-26 18:44:09 +0000 UTC - event for netserver-0: {default-scheduler } Scheduled: Successfully assigned esipp-5938/netserver-0 to bootstrap-e2e-minion-group-dzls Nov 26 18:50:41.336: INFO: At 2022-11-26 18:44:09 +0000 UTC - event for netserver-1: {default-scheduler } Scheduled: Successfully assigned esipp-5938/netserver-1 to bootstrap-e2e-minion-group-gnb8 Nov 26 18:50:41.336: INFO: At 2022-11-26 18:44:09 +0000 UTC - event for netserver-2: {default-scheduler } Scheduled: Successfully assigned esipp-5938/netserver-2 to bootstrap-e2e-minion-group-p1wq Nov 26 18:50:41.336: INFO: At 2022-11-26 18:44:11 +0000 UTC - event for netserver-0: {kubelet bootstrap-e2e-minion-group-dzls} Pulled: Container image "registry.k8s.io/e2e-test-images/agnhost:2.43" already present on machine Nov 26 18:50:41.336: INFO: At 2022-11-26 18:44:11 +0000 UTC - event for netserver-0: {kubelet bootstrap-e2e-minion-group-dzls} Started: Started container webserver Nov 26 18:50:41.336: INFO: At 2022-11-26 18:44:11 +0000 UTC - event for netserver-0: {kubelet bootstrap-e2e-minion-group-dzls} Created: Created container webserver Nov 26 18:50:41.336: INFO: At 2022-11-26 18:44:11 +0000 UTC - event for netserver-1: {kubelet bootstrap-e2e-minion-group-gnb8} Pulled: Container image "registry.k8s.io/e2e-test-images/agnhost:2.43" already present on machine Nov 26 18:50:41.336: INFO: At 2022-11-26 18:44:11 +0000 UTC - event for netserver-1: {kubelet bootstrap-e2e-minion-group-gnb8} Killing: Stopping container webserver Nov 26 18:50:41.336: INFO: At 2022-11-26 18:44:11 +0000 UTC - event for netserver-1: {kubelet bootstrap-e2e-minion-group-gnb8} Started: Started container webserver Nov 26 18:50:41.336: INFO: At 2022-11-26 18:44:11 +0000 UTC - event for netserver-1: {kubelet bootstrap-e2e-minion-group-gnb8} Created: Created container webserver Nov 26 18:50:41.336: INFO: At 2022-11-26 18:44:11 +0000 UTC - event for netserver-2: {kubelet bootstrap-e2e-minion-group-p1wq} Started: Started container webserver Nov 26 18:50:41.336: INFO: At 2022-11-26 18:44:11 +0000 UTC - event for netserver-2: {kubelet bootstrap-e2e-minion-group-p1wq} Created: Created container webserver Nov 26 18:50:41.336: INFO: At 2022-11-26 18:44:11 +0000 UTC - event for netserver-2: {kubelet bootstrap-e2e-minion-group-p1wq} Pulled: Container image "registry.k8s.io/e2e-test-images/agnhost:2.43" already present on machine Nov 26 18:50:41.336: INFO: At 2022-11-26 18:44:11 +0000 UTC - event for netserver-2: {kubelet bootstrap-e2e-minion-group-p1wq} Killing: Stopping container webserver Nov 26 18:50:41.336: INFO: At 2022-11-26 18:44:12 +0000 UTC - event for netserver-1: {kubelet bootstrap-e2e-minion-group-gnb8} SandboxChanged: Pod sandbox changed, it will be killed and re-created. Nov 26 18:50:41.336: INFO: At 2022-11-26 18:44:13 +0000 UTC - event for netserver-2: {kubelet bootstrap-e2e-minion-group-p1wq} SandboxChanged: Pod sandbox changed, it will be killed and re-created. 
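
The namespace event dump that starts here (and continues below) is the framework's standard failure diagnostics. The same information can be pulled from a live cluster when triaging; a rough client-go sketch follows, reusing the kubeconfig path and namespace that appear in the log, with everything else (program layout, formatting, error handling) an illustrative assumption.

// Minimal sketch: list the events of the test namespace, roughly what the
// framework's "Collecting events from namespace" step does on failure.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/workspace/.kube/config")
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)
	events, err := cs.CoreV1().Events("esipp-5938").List(context.TODO(), metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	for _, e := range events.Items {
		fmt.Printf("%s %s/%s %s: %s\n",
			e.LastTimestamp.Format("15:04:05"), e.InvolvedObject.Kind, e.InvolvedObject.Name, e.Reason, e.Message)
	}
}
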
Nov 26 18:50:41.336: INFO: At 2022-11-26 18:44:16 +0000 UTC - event for netserver-1: {kubelet bootstrap-e2e-minion-group-gnb8} BackOff: Back-off restarting failed container webserver in pod netserver-1_esipp-5938(df83a3b6-02fe-4b9e-80fb-abf8398755cf) Nov 26 18:50:41.336: INFO: At 2022-11-26 18:44:16 +0000 UTC - event for netserver-2: {kubelet bootstrap-e2e-minion-group-p1wq} BackOff: Back-off restarting failed container webserver in pod netserver-2_esipp-5938(98bc576a-5b3d-41fc-9dec-c1dfa7fb58a1) Nov 26 18:50:41.336: INFO: At 2022-11-26 18:44:30 +0000 UTC - event for netserver-0: {kubelet bootstrap-e2e-minion-group-dzls} Killing: Stopping container webserver Nov 26 18:50:41.336: INFO: At 2022-11-26 18:44:31 +0000 UTC - event for netserver-0: {kubelet bootstrap-e2e-minion-group-dzls} SandboxChanged: Pod sandbox changed, it will be killed and re-created. Nov 26 18:50:41.336: INFO: At 2022-11-26 18:44:34 +0000 UTC - event for netserver-0: {kubelet bootstrap-e2e-minion-group-dzls} BackOff: Back-off restarting failed container webserver in pod netserver-0_esipp-5938(27b42c78-6945-4b40-9345-e948f56b9561) Nov 26 18:50:41.336: INFO: At 2022-11-26 18:45:22 +0000 UTC - event for test-container-pod: {default-scheduler } Scheduled: Successfully assigned esipp-5938/test-container-pod to bootstrap-e2e-minion-group-dzls Nov 26 18:50:41.336: INFO: At 2022-11-26 18:45:23 +0000 UTC - event for test-container-pod: {kubelet bootstrap-e2e-minion-group-dzls} Pulled: Container image "registry.k8s.io/e2e-test-images/agnhost:2.43" already present on machine Nov 26 18:50:41.336: INFO: At 2022-11-26 18:45:23 +0000 UTC - event for test-container-pod: {kubelet bootstrap-e2e-minion-group-dzls} Created: Created container webserver Nov 26 18:50:41.336: INFO: At 2022-11-26 18:45:23 +0000 UTC - event for test-container-pod: {kubelet bootstrap-e2e-minion-group-dzls} Started: Started container webserver Nov 26 18:50:41.336: INFO: At 2022-11-26 18:45:27 +0000 UTC - event for external-local-nodes: {replication-controller } SuccessfulCreate: Created pod: external-local-nodes-z9qgt Nov 26 18:50:41.336: INFO: At 2022-11-26 18:45:28 +0000 UTC - event for external-local-nodes-z9qgt: {kubelet bootstrap-e2e-minion-group-dzls} Pulled: Container image "registry.k8s.io/e2e-test-images/agnhost:2.43" already present on machine Nov 26 18:50:41.336: INFO: At 2022-11-26 18:45:28 +0000 UTC - event for external-local-nodes-z9qgt: {kubelet bootstrap-e2e-minion-group-dzls} Created: Created container netexec Nov 26 18:50:41.336: INFO: At 2022-11-26 18:45:28 +0000 UTC - event for external-local-nodes-z9qgt: {kubelet bootstrap-e2e-minion-group-dzls} Started: Started container netexec Nov 26 18:50:41.336: INFO: At 2022-11-26 18:45:29 +0000 UTC - event for external-local-nodes-z9qgt: {kubelet bootstrap-e2e-minion-group-dzls} Killing: Stopping container netexec Nov 26 18:50:41.336: INFO: At 2022-11-26 18:45:30 +0000 UTC - event for external-local-nodes-z9qgt: {kubelet bootstrap-e2e-minion-group-dzls} SandboxChanged: Pod sandbox changed, it will be killed and re-created. 
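
Taken together, the BackOff/Killing/SandboxChanged events above and the blank Endpoints line in the earlier describe output for external-local-nodes point the same way: the backing pod never became Ready, so the service had no endpoints and every poke of 34.168.232.208:8081 was refused or timed out. A small client-go sketch for checking that directly follows; as with the previous sketch, the program structure is an assumption and only the namespace, service name, and kubeconfig path come from the log. The remaining events and the pod/node dumps continue below.

// Sketch: confirm what the blank "Endpoints:" line in the describe output means,
// i.e. whether the service has any ready backend addresses at all.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/workspace/.kube/config")
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)
	ep, err := cs.CoreV1().Endpoints("esipp-5938").Get(context.TODO(), "external-local-nodes", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	ready, notReady := 0, 0
	for _, s := range ep.Subsets {
		ready += len(s.Addresses)
		notReady += len(s.NotReadyAddresses)
	}
	fmt.Printf("ready=%d notReady=%d\n", ready, notReady)
}
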
Nov 26 18:50:41.336: INFO: At 2022-11-26 18:45:34 +0000 UTC - event for external-local-nodes-z9qgt: {kubelet bootstrap-e2e-minion-group-dzls} BackOff: Back-off restarting failed container netexec in pod external-local-nodes-z9qgt_esipp-5938(0578d815-38e9-4e84-837f-01f490e04b00) Nov 26 18:50:41.336: INFO: At 2022-11-26 18:45:35 +0000 UTC - event for external-local-nodes-z9qgt: {kubelet bootstrap-e2e-minion-group-dzls} Unhealthy: Readiness probe failed: Get "http://10.64.3.64:80/hostName": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Nov 26 18:50:41.336: INFO: At 2022-11-26 18:45:53 +0000 UTC - event for external-local-nodes-z9qgt: {kubelet bootstrap-e2e-minion-group-dzls} Unhealthy: Readiness probe failed: Get "http://10.64.3.65:80/hostName": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Nov 26 18:50:41.380: INFO: POD NODE PHASE GRACE CONDITIONS Nov 26 18:50:41.380: INFO: external-local-nodes-z9qgt bootstrap-e2e-minion-group-dzls Running [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:45:27 +0000 UTC } {Ready False 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:48:34 +0000 UTC ContainersNotReady containers with unready status: [netexec]} {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:48:34 +0000 UTC ContainersNotReady containers with unready status: [netexec]} {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:45:27 +0000 UTC }] Nov 26 18:50:41.380: INFO: netserver-0 bootstrap-e2e-minion-group-dzls Running [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:44:09 +0000 UTC } {Ready True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:49:00 +0000 UTC } {ContainersReady True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:49:00 +0000 UTC } {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:44:09 +0000 UTC }] Nov 26 18:50:41.380: INFO: netserver-1 bootstrap-e2e-minion-group-gnb8 Running [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:44:09 +0000 UTC } {Ready True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:49:20 +0000 UTC } {ContainersReady True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:49:20 +0000 UTC } {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:44:09 +0000 UTC }] Nov 26 18:50:41.380: INFO: netserver-2 bootstrap-e2e-minion-group-p1wq Running [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:44:09 +0000 UTC } {Ready False 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:50:03 +0000 UTC ContainersNotReady containers with unready status: [webserver]} {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:50:03 +0000 UTC ContainersNotReady containers with unready status: [webserver]} {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:44:09 +0000 UTC }] Nov 26 18:50:41.380: INFO: test-container-pod bootstrap-e2e-minion-group-dzls Running [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:45:22 +0000 UTC } {Ready True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:45:24 +0000 UTC } {ContainersReady True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:45:24 +0000 UTC } {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2022-11-26 18:45:22 +0000 UTC }] Nov 26 18:50:41.380: INFO: Nov 26 18:50:41.769: INFO: Logging node info for node bootstrap-e2e-master Nov 26 18:50:41.813: INFO: Node Info: &Node{ObjectMeta:{bootstrap-e2e-master 3b91a491-10ed-470d-8d9a-7e47529f6987 4734 0 2022-11-26 18:39:29 +0000 UTC <nil> <nil> map[beta.kubernetes.io/arch:amd64 beta.kubernetes.io/instance-type:n1-standard-1 
beta.kubernetes.io/os:linux cloud.google.com/metadata-proxy-ready:true failure-domain.beta.kubernetes.io/region:us-west1 failure-domain.beta.kubernetes.io/zone:us-west1-b kubernetes.io/arch:amd64 kubernetes.io/hostname:bootstrap-e2e-master kubernetes.io/os:linux node.kubernetes.io/instance-type:n1-standard-1 topology.kubernetes.io/region:us-west1 topology.kubernetes.io/zone:us-west1-b] map[node.alpha.kubernetes.io/ttl:0 volumes.kubernetes.io/controller-managed-attach-detach:true] [] [] [{kubelet Update v1 2022-11-26 18:39:29 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-attach-detach":{}},"f:labels":{".":{},"f:beta.kubernetes.io/arch":{},"f:beta.kubernetes.io/instance-type":{},"f:beta.kubernetes.io/os":{},"f:cloud.google.com/metadata-proxy-ready":{},"f:failure-domain.beta.kubernetes.io/region":{},"f:failure-domain.beta.kubernetes.io/zone":{},"f:kubernetes.io/arch":{},"f:kubernetes.io/hostname":{},"f:kubernetes.io/os":{},"f:node.kubernetes.io/instance-type":{},"f:topology.kubernetes.io/region":{},"f:topology.kubernetes.io/zone":{}}},"f:spec":{"f:providerID":{},"f:unschedulable":{}}} } {kube-controller-manager Update v1 2022-11-26 18:39:46 +0000 UTC FieldsV1 {"f:status":{"f:conditions":{"k:{\"type\":\"NetworkUnavailable\"}":{"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{}}}}} status} {kube-controller-manager Update v1 2022-11-26 18:39:49 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}},"f:spec":{"f:podCIDR":{},"f:podCIDRs":{".":{},"v:\"10.64.0.0/24\"":{}},"f:taints":{}}} } {kubelet Update v1 2022-11-26 18:45:48 +0000 UTC FieldsV1 {"f:status":{"f:conditions":{"k:{\"type\":\"DiskPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"MemoryPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"PIDPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"Ready\"}":{"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{}}},"f:images":{}}} status}]},Spec:NodeSpec{PodCIDR:10.64.0.0/24,DoNotUseExternalID:,ProviderID:gce://k8s-jkns-e2e-gke-ubuntu-slow/us-west1-b/bootstrap-e2e-master,Unschedulable:true,Taints:[]Taint{Taint{Key:node-role.kubernetes.io/master,Value:,Effect:NoSchedule,TimeAdded:<nil>,},Taint{Key:node.kubernetes.io/unschedulable,Value:,Effect:NoSchedule,TimeAdded:<nil>,},},ConfigSource:nil,PodCIDRs:[10.64.0.0/24],},Status:NodeStatus{Capacity:ResourceList{attachable-volumes-gce-pd: {{127 0} {<nil>} 127 DecimalSI},cpu: {{1 0} {<nil>} 1 DecimalSI},ephemeral-storage: {{16656896000 0} {<nil>} 16266500Ki BinarySI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{3858374656 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Allocatable:ResourceList{attachable-volumes-gce-pd: {{127 0} {<nil>} 127 DecimalSI},cpu: {{1 0} {<nil>} 1 DecimalSI},ephemeral-storage: {{14991206376 0} {<nil>} 14991206376 DecimalSI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{3596230656 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Phase:,Conditions:[]NodeCondition{NodeCondition{Type:NetworkUnavailable,Status:False,LastHeartbeatTime:2022-11-26 18:39:46 +0000 UTC,LastTransitionTime:2022-11-26 18:39:46 +0000 UTC,Reason:RouteCreated,Message:RouteController created a route,},NodeCondition{Type:MemoryPressure,Status:False,LastHeartbeatTime:2022-11-26 18:45:48 +0000 UTC,LastTransitionTime:2022-11-26 18:39:29 +0000 
UTC,Reason:KubeletHasSufficientMemory,Message:kubelet has sufficient memory available,},NodeCondition{Type:DiskPressure,Status:False,LastHeartbeatTime:2022-11-26 18:45:48 +0000 UTC,LastTransitionTime:2022-11-26 18:39:29 +0000 UTC,Reason:KubeletHasNoDiskPressure,Message:kubelet has no disk pressure,},NodeCondition{Type:PIDPressure,Status:False,LastHeartbeatTime:2022-11-26 18:45:48 +0000 UTC,LastTransitionTime:2022-11-26 18:39:29 +0000 UTC,Reason:KubeletHasSufficientPID,Message:kubelet has sufficient PID available,},NodeCondition{Type:Ready,Status:True,LastHeartbeatTime:2022-11-26 18:45:48 +0000 UTC,LastTransitionTime:2022-11-26 18:39:48 +0000 UTC,Reason:KubeletReady,Message:kubelet is posting ready status. AppArmor enabled,},},Addresses:[]NodeAddress{NodeAddress{Type:InternalIP,Address:10.138.0.2,},NodeAddress{Type:ExternalIP,Address:34.83.88.61,},NodeAddress{Type:InternalDNS,Address:bootstrap-e2e-master.c.k8s-jkns-e2e-gke-ubuntu-slow.internal,},NodeAddress{Type:Hostname,Address:bootstrap-e2e-master.c.k8s-jkns-e2e-gke-ubuntu-slow.internal,},},DaemonEndpoints:NodeDaemonEndpoints{KubeletEndpoint:DaemonEndpoint{Port:10250,},},NodeInfo:NodeSystemInfo{MachineID:3cf1b58d8a4ce7f42c27588b99e07e59,SystemUUID:3cf1b58d-8a4c-e7f4-2c27-588b99e07e59,BootID:5ae5a72a-a603-4ca4-8f28-2f5024cd1d88,KernelVersion:5.10.123+,OSImage:Container-Optimized OS from Google,ContainerRuntimeVersion:containerd://1.7.0-beta.0-149-gd06318622,KubeletVersion:v1.27.0-alpha.0.50+70617042976dc1,KubeProxyVersion:v1.27.0-alpha.0.50+70617042976dc1,OperatingSystem:linux,Architecture:amd64,},Images:[]ContainerImage{ContainerImage{Names:[registry.k8s.io/kube-apiserver-amd64:v1.27.0-alpha.0.50_70617042976dc1],SizeBytes:135160272,},ContainerImage{Names:[registry.k8s.io/kube-controller-manager-amd64:v1.27.0-alpha.0.50_70617042976dc1],SizeBytes:124990265,},ContainerImage{Names:[registry.k8s.io/etcd@sha256:dd75ec974b0a2a6f6bb47001ba09207976e625db898d1b16735528c009cb171c registry.k8s.io/etcd:3.5.6-0],SizeBytes:102542580,},ContainerImage{Names:[registry.k8s.io/kube-scheduler-amd64:v1.27.0-alpha.0.50_70617042976dc1],SizeBytes:57660216,},ContainerImage{Names:[gke.gcr.io/prometheus-to-sd@sha256:e739643c3939ba0b161425f45a1989eedfc4a3b166db9a7100863296b4c70510 gke.gcr.io/prometheus-to-sd:v0.11.1-gke.1],SizeBytes:48742566,},ContainerImage{Names:[gcr.io/k8s-ingress-image-push/ingress-gce-glbc-amd64@sha256:5db27383add6d9f4ebdf0286409ac31f7f5d273690204b341a4e37998917693b gcr.io/k8s-ingress-image-push/ingress-gce-glbc-amd64:v1.20.1],SizeBytes:36598135,},ContainerImage{Names:[registry.k8s.io/addon-manager/kube-addon-manager@sha256:49cc4e6e4a3745b427ce14b0141476ab339bb65c6bc05033019e046c8727dcb0 registry.k8s.io/addon-manager/kube-addon-manager:v9.1.6],SizeBytes:30464183,},ContainerImage{Names:[registry.k8s.io/kas-network-proxy/proxy-server@sha256:2c111f004bec24888d8cfa2a812a38fb8341350abac67dcd0ac64e709dfe389c registry.k8s.io/kas-network-proxy/proxy-server:v0.0.33],SizeBytes:22020129,},ContainerImage{Names:[registry.k8s.io/metadata-proxy@sha256:e914645f22e946bce5165737e1b244e0a296ad1f0f81a9531adc57af2780978a registry.k8s.io/metadata-proxy:v0.1.12],SizeBytes:5301657,},ContainerImage{Names:[registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d registry.k8s.io/pause:3.8],SizeBytes:311286,},},VolumesInUse:[],VolumesAttached:[]AttachedVolume{},Config:nil,},} Nov 26 18:50:41.814: INFO: Logging kubelet events for node bootstrap-e2e-master Nov 26 18:50:41.859: INFO: Logging pods the kubelet thinks is on node 
bootstrap-e2e-master Nov 26 18:50:41.926: INFO: etcd-server-bootstrap-e2e-master started at 2022-11-26 18:38:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:41.926: INFO: Container etcd-container ready: true, restart count 2 Nov 26 18:50:41.926: INFO: l7-lb-controller-bootstrap-e2e-master started at 2022-11-26 18:39:03 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:41.926: INFO: Container l7-lb-controller ready: false, restart count 5 Nov 26 18:50:41.926: INFO: etcd-server-events-bootstrap-e2e-master started at 2022-11-26 18:38:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:41.926: INFO: Container etcd-container ready: true, restart count 0 Nov 26 18:50:41.926: INFO: kube-apiserver-bootstrap-e2e-master started at 2022-11-26 18:38:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:41.926: INFO: Container kube-apiserver ready: true, restart count 2 Nov 26 18:50:41.926: INFO: kube-controller-manager-bootstrap-e2e-master started at 2022-11-26 18:38:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:41.926: INFO: Container kube-controller-manager ready: false, restart count 4 Nov 26 18:50:41.926: INFO: kube-scheduler-bootstrap-e2e-master started at 2022-11-26 18:38:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:41.926: INFO: Container kube-scheduler ready: true, restart count 3 Nov 26 18:50:41.926: INFO: kube-addon-manager-bootstrap-e2e-master started at 2022-11-26 18:39:03 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:41.926: INFO: Container kube-addon-manager ready: false, restart count 2 Nov 26 18:50:41.926: INFO: metadata-proxy-v0.1-svjxn started at 2022-11-26 18:39:48 +0000 UTC (0+2 container statuses recorded) Nov 26 18:50:41.926: INFO: Container metadata-proxy ready: true, restart count 0 Nov 26 18:50:41.926: INFO: Container prometheus-to-sd-exporter ready: true, restart count 0 Nov 26 18:50:41.926: INFO: konnectivity-server-bootstrap-e2e-master started at 2022-11-26 18:38:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:41.926: INFO: Container konnectivity-server-container ready: true, restart count 3 Nov 26 18:50:42.108: INFO: Latency metrics for node bootstrap-e2e-master Nov 26 18:50:42.108: INFO: Logging node info for node bootstrap-e2e-minion-group-dzls Nov 26 18:50:42.149: INFO: Node Info: &Node{ObjectMeta:{bootstrap-e2e-minion-group-dzls 073571af-ef8e-4d0a-9070-5398dab26550 6842 0 2022-11-26 18:39:33 +0000 UTC <nil> <nil> map[beta.kubernetes.io/arch:amd64 beta.kubernetes.io/instance-type:n1-standard-2 beta.kubernetes.io/os:linux cloud.google.com/metadata-proxy-ready:true failure-domain.beta.kubernetes.io/region:us-west1 failure-domain.beta.kubernetes.io/zone:us-west1-b kubernetes.io/arch:amd64 kubernetes.io/hostname:bootstrap-e2e-minion-group-dzls kubernetes.io/os:linux node.kubernetes.io/instance-type:n1-standard-2 topology.hostpath.csi/node:bootstrap-e2e-minion-group-dzls topology.kubernetes.io/region:us-west1 topology.kubernetes.io/zone:us-west1-b] map[csi.volume.kubernetes.io/nodeid:{"csi-mock-csi-mock-volumes-2237":"bootstrap-e2e-minion-group-dzls"} node.alpha.kubernetes.io/ttl:0 volumes.kubernetes.io/controller-managed-attach-detach:true] [] [] [{kube-controller-manager Update v1 2022-11-26 18:39:33 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}},"f:spec":{"f:podCIDR":{},"f:podCIDRs":{".":{},"v:\"10.64.3.0/24\"":{}}}} } {kubelet Update v1 2022-11-26 18:39:33 +0000 UTC FieldsV1 
{"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-attach-detach":{}},"f:labels":{".":{},"f:beta.kubernetes.io/arch":{},"f:beta.kubernetes.io/instance-type":{},"f:beta.kubernetes.io/os":{},"f:cloud.google.com/metadata-proxy-ready":{},"f:failure-domain.beta.kubernetes.io/region":{},"f:failure-domain.beta.kubernetes.io/zone":{},"f:kubernetes.io/arch":{},"f:kubernetes.io/hostname":{},"f:kubernetes.io/os":{},"f:node.kubernetes.io/instance-type":{},"f:topology.kubernetes.io/region":{},"f:topology.kubernetes.io/zone":{}}},"f:spec":{"f:providerID":{}}} } {kube-controller-manager Update v1 2022-11-26 18:44:32 +0000 UTC FieldsV1 {"f:status":{"f:conditions":{"k:{\"type\":\"NetworkUnavailable\"}":{"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{}}}}} status} {kubelet Update v1 2022-11-26 18:50:17 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:csi.volume.kubernetes.io/nodeid":{}},"f:labels":{"f:topology.hostpath.csi/node":{}}},"f:status":{"f:conditions":{"k:{\"type\":\"DiskPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"MemoryPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"PIDPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"Ready\"}":{"f:lastHeartbeatTime":{},"f:message":{},"f:reason":{},"f:status":{}}},"f:images":{}}} status} {node-problem-detector Update v1 2022-11-26 18:50:21 +0000 UTC FieldsV1 {"f:status":{"f:conditions":{"k:{\"type\":\"CorruptDockerOverlay2\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentContainerdRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentDockerRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentKubeletRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentUnregisterNetDevice\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"KernelDeadlock\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"ReadonlyFilesystem\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}}}}} status}]},Spec:NodeSpec{PodCIDR:10.64.3.0/24,DoNotUseExternalID:,ProviderID:gce://k8s-jkns-e2e-gke-ubuntu-slow/us-west1-b/bootstrap-e2e-minion-group-dzls,Unschedulable:false,Taints:[]Taint{},ConfigSource:nil,PodCIDRs:[10.64.3.0/24],},Status:NodeStatus{Capacity:ResourceList{attachable-volumes-gce-pd: {{127 0} {<nil>} 127 DecimalSI},cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{101203873792 0} {<nil>} 98831908Ki BinarySI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{7815438336 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Allocatable:ResourceList{attachable-volumes-gce-pd: {{127 0} {<nil>} 127 DecimalSI},cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{91083486262 0} {<nil>} 91083486262 DecimalSI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{7553294336 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 
DecimalSI},},Phase:,Conditions:[]NodeCondition{NodeCondition{Type:ReadonlyFilesystem,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:FilesystemIsNotReadOnly,Message:Filesystem is not read-only,},NodeCondition{Type:CorruptDockerOverlay2,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoCorruptDockerOverlay2,Message:docker overlay2 is functioning properly,},NodeCondition{Type:FrequentUnregisterNetDevice,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentUnregisterNetDevice,Message:node is functioning properly,},NodeCondition{Type:FrequentKubeletRestart,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentKubeletRestart,Message:kubelet is functioning properly,},NodeCondition{Type:FrequentDockerRestart,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentDockerRestart,Message:docker is functioning properly,},NodeCondition{Type:FrequentContainerdRestart,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentContainerdRestart,Message:containerd is functioning properly,},NodeCondition{Type:KernelDeadlock,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:KernelHasNoDeadlock,Message:kernel has no deadlock,},NodeCondition{Type:NetworkUnavailable,Status:False,LastHeartbeatTime:2022-11-26 18:39:46 +0000 UTC,LastTransitionTime:2022-11-26 18:39:46 +0000 UTC,Reason:RouteCreated,Message:RouteController created a route,},NodeCondition{Type:MemoryPressure,Status:False,LastHeartbeatTime:2022-11-26 18:46:41 +0000 UTC,LastTransitionTime:2022-11-26 18:39:33 +0000 UTC,Reason:KubeletHasSufficientMemory,Message:kubelet has sufficient memory available,},NodeCondition{Type:DiskPressure,Status:False,LastHeartbeatTime:2022-11-26 18:46:41 +0000 UTC,LastTransitionTime:2022-11-26 18:39:33 +0000 UTC,Reason:KubeletHasNoDiskPressure,Message:kubelet has no disk pressure,},NodeCondition{Type:PIDPressure,Status:False,LastHeartbeatTime:2022-11-26 18:46:41 +0000 UTC,LastTransitionTime:2022-11-26 18:39:33 +0000 UTC,Reason:KubeletHasSufficientPID,Message:kubelet has sufficient PID available,},NodeCondition{Type:Ready,Status:True,LastHeartbeatTime:2022-11-26 18:46:41 +0000 UTC,LastTransitionTime:2022-11-26 18:39:33 +0000 UTC,Reason:KubeletReady,Message:kubelet is posting ready status. 
AppArmor enabled,},},Addresses:[]NodeAddress{NodeAddress{Type:InternalIP,Address:10.138.0.5,},NodeAddress{Type:ExternalIP,Address:34.168.240.199,},NodeAddress{Type:InternalDNS,Address:bootstrap-e2e-minion-group-dzls.c.k8s-jkns-e2e-gke-ubuntu-slow.internal,},NodeAddress{Type:Hostname,Address:bootstrap-e2e-minion-group-dzls.c.k8s-jkns-e2e-gke-ubuntu-slow.internal,},},DaemonEndpoints:NodeDaemonEndpoints{KubeletEndpoint:DaemonEndpoint{Port:10250,},},NodeInfo:NodeSystemInfo{MachineID:8acf528984519ae32bb25015f71e5ef3,SystemUUID:8acf5289-8451-9ae3-2bb2-5015f71e5ef3,BootID:5e47f3ec-cd63-4220-9ac4-8feb7752a952,KernelVersion:5.10.123+,OSImage:Container-Optimized OS from Google,ContainerRuntimeVersion:containerd://1.7.0-beta.0-149-gd06318622,KubeletVersion:v1.27.0-alpha.0.50+70617042976dc1,KubeProxyVersion:v1.27.0-alpha.0.50+70617042976dc1,OperatingSystem:linux,Architecture:amd64,},Images:[]ContainerImage{ContainerImage{Names:[registry.k8s.io/e2e-test-images/jessie-dnsutils@sha256:24aaf2626d6b27864c29de2097e8bbb840b3a414271bf7c8995e431e47d8408e registry.k8s.io/e2e-test-images/jessie-dnsutils:1.7],SizeBytes:112030336,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/volume/nfs@sha256:3bda73f2428522b0e342af80a0b9679e8594c2126f2b3cca39ed787589741b9e registry.k8s.io/e2e-test-images/volume/nfs:1.3],SizeBytes:95836203,},ContainerImage{Names:[registry.k8s.io/sig-storage/nfs-provisioner@sha256:e943bb77c7df05ebdc8c7888b2db289b13bf9f012d6a3a5a74f14d4d5743d439 registry.k8s.io/sig-storage/nfs-provisioner:v3.0.1],SizeBytes:90632047,},ContainerImage{Names:[registry.k8s.io/kube-proxy-amd64:v1.27.0-alpha.0.50_70617042976dc1],SizeBytes:67201736,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/agnhost@sha256:16bbf38c463a4223d8cfe4da12bc61010b082a79b4bb003e2d3ba3ece5dd5f9e registry.k8s.io/e2e-test-images/agnhost:2.43],SizeBytes:51706353,},ContainerImage{Names:[gke.gcr.io/prometheus-to-sd@sha256:e739643c3939ba0b161425f45a1989eedfc4a3b166db9a7100863296b4c70510 gke.gcr.io/prometheus-to-sd:v0.11.1-gke.1],SizeBytes:48742566,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/httpd@sha256:148b022f5c5da426fc2f3c14b5c0867e58ef05961510c84749ac1fddcb0fef22 registry.k8s.io/e2e-test-images/httpd:2.4.38-4],SizeBytes:40764257,},ContainerImage{Names:[registry.k8s.io/metrics-server/metrics-server@sha256:6385aec64bb97040a5e692947107b81e178555c7a5b71caa90d733e4130efc10 registry.k8s.io/metrics-server/metrics-server:v0.5.2],SizeBytes:26023008,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-provisioner@sha256:ee3b525d5b89db99da3b8eb521d9cd90cb6e9ef0fbb651e98bb37be78d36b5b8 registry.k8s.io/sig-storage/csi-provisioner:v3.3.0],SizeBytes:25491225,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-attacher@sha256:9a685020911e2725ad019dbce6e4a5ab93d51e3d4557f115e64343345e05781b registry.k8s.io/sig-storage/csi-attacher:v4.0.0],SizeBytes:23847201,},ContainerImage{Names:[registry.k8s.io/sig-storage/snapshot-controller@sha256:823c75d0c45d1427f6d850070956d9ca657140a7bbf828381541d1d808475280 registry.k8s.io/sig-storage/snapshot-controller:v6.1.0],SizeBytes:22620891,},ContainerImage{Names:[registry.k8s.io/sig-storage/hostpathplugin@sha256:92257881c1d6493cf18299a24af42330f891166560047902b8d431fb66b01af5 registry.k8s.io/sig-storage/hostpathplugin:v1.9.0],SizeBytes:18758628,},ContainerImage{Names:[registry.k8s.io/cpa/cluster-proportional-autoscaler@sha256:fd636b33485c7826fb20ef0688a83ee0910317dbb6c0c6f3ad14661c1db25def 
registry.k8s.io/cpa/cluster-proportional-autoscaler:1.8.4],SizeBytes:15209393,},ContainerImage{Names:[registry.k8s.io/coredns/coredns@sha256:8e352a029d304ca7431c6507b56800636c321cb52289686a581ab70aaa8a2e2a registry.k8s.io/coredns/coredns:v1.9.3],SizeBytes:14837849,},ContainerImage{Names:[registry.k8s.io/autoscaling/addon-resizer@sha256:43f129b81d28f0fdd54de6d8e7eacd5728030782e03db16087fc241ad747d3d6 registry.k8s.io/autoscaling/addon-resizer:1.8.14],SizeBytes:10153852,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-node-driver-registrar@sha256:0103eee7c35e3e0b5cd8cdca9850dc71c793cdeb6669d8be7a89440da2d06ae4 registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.5.1],SizeBytes:9133109,},ContainerImage{Names:[registry.k8s.io/kas-network-proxy/proxy-agent@sha256:48f2a4ec3e10553a81b8dd1c6fa5fe4bcc9617f78e71c1ca89c6921335e2d7da registry.k8s.io/kas-network-proxy/proxy-agent:v0.0.33],SizeBytes:8512162,},ContainerImage{Names:[registry.k8s.io/networking/ingress-gce-404-server-with-metrics-amd64@sha256:7eb7b3cee4d33c10c49893ad3c386232b86d4067de5251294d4c620d6e072b93 registry.k8s.io/networking/ingress-gce-404-server-with-metrics-amd64:v1.10.11],SizeBytes:6463068,},ContainerImage{Names:[registry.k8s.io/metadata-proxy@sha256:e914645f22e946bce5165737e1b244e0a296ad1f0f81a9531adc57af2780978a registry.k8s.io/metadata-proxy:v0.1.12],SizeBytes:5301657,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/busybox@sha256:2e0f836850e09b8b7cc937681d6194537a09fbd5f6b9e08f4d646a85128e8937 registry.k8s.io/e2e-test-images/busybox:1.29-4],SizeBytes:731990,},ContainerImage{Names:[registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097 registry.k8s.io/pause:3.9],SizeBytes:321520,},ContainerImage{Names:[registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d registry.k8s.io/pause:3.8],SizeBytes:311286,},},VolumesInUse:[],VolumesAttached:[]AttachedVolume{},Config:nil,},} Nov 26 18:50:42.150: INFO: Logging kubelet events for node bootstrap-e2e-minion-group-dzls Nov 26 18:50:42.204: INFO: Logging pods the kubelet thinks is on node bootstrap-e2e-minion-group-dzls Nov 26 18:50:42.303: INFO: test-container-pod started at 2022-11-26 18:45:22 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container webserver ready: true, restart count 0 Nov 26 18:50:42.303: INFO: test-container-pod started at 2022-11-26 18:48:19 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container webserver ready: true, restart count 3 Nov 26 18:50:42.303: INFO: csi-mockplugin-attacher-0 started at 2022-11-26 18:49:09 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container csi-attacher ready: false, restart count 0 Nov 26 18:50:42.303: INFO: lb-internal-fmnnv started at 2022-11-26 18:47:14 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container netexec ready: true, restart count 2 Nov 26 18:50:42.303: INFO: kube-proxy-bootstrap-e2e-minion-group-dzls started at 2022-11-26 18:39:33 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container kube-proxy ready: true, restart count 6 Nov 26 18:50:42.303: INFO: csi-mockplugin-0 started at 2022-11-26 18:44:05 +0000 UTC (0+3 container statuses recorded) Nov 26 18:50:42.303: INFO: Container csi-provisioner ready: false, restart count 3 Nov 26 18:50:42.303: INFO: Container driver-registrar ready: false, restart count 3 Nov 26 18:50:42.303: INFO: Container mock ready: false, restart count 3 Nov 26 18:50:42.303: INFO: 
pvc-volume-tester-9phjk started at 2022-11-26 18:48:38 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container volume-tester ready: false, restart count 0 Nov 26 18:50:42.303: INFO: l7-default-backend-8549d69d99-wtn4d started at 2022-11-26 18:39:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container default-http-backend ready: true, restart count 0 Nov 26 18:50:42.303: INFO: affinity-lb-esipp-transition-brzbx started at 2022-11-26 18:45:17 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container affinity-lb-esipp-transition ready: true, restart count 1 Nov 26 18:50:42.303: INFO: pvc-volume-tester-nj8zb started at 2022-11-26 18:44:17 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container volume-tester ready: false, restart count 0 Nov 26 18:50:42.303: INFO: csi-mockplugin-0 started at 2022-11-26 18:48:36 +0000 UTC (0+3 container statuses recorded) Nov 26 18:50:42.303: INFO: Container csi-provisioner ready: true, restart count 3 Nov 26 18:50:42.303: INFO: Container driver-registrar ready: true, restart count 3 Nov 26 18:50:42.303: INFO: Container mock ready: true, restart count 3 Nov 26 18:50:42.303: INFO: execpod-acceptwm2nm started at 2022-11-26 18:41:50 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container agnhost-container ready: true, restart count 1 Nov 26 18:50:42.303: INFO: lb-sourcerange-b4r2c started at 2022-11-26 18:42:06 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container netexec ready: true, restart count 3 Nov 26 18:50:42.303: INFO: mutability-test-26zlk started at 2022-11-26 18:47:41 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container netexec ready: true, restart count 2 Nov 26 18:50:42.303: INFO: metadata-proxy-v0.1-zcpfg started at 2022-11-26 18:39:33 +0000 UTC (0+2 container statuses recorded) Nov 26 18:50:42.303: INFO: Container metadata-proxy ready: true, restart count 0 Nov 26 18:50:42.303: INFO: Container prometheus-to-sd-exporter ready: true, restart count 0 Nov 26 18:50:42.303: INFO: konnectivity-agent-jfgxg started at 2022-11-26 18:39:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container konnectivity-agent ready: false, restart count 5 Nov 26 18:50:42.303: INFO: netserver-0 started at 2022-11-26 18:46:43 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container webserver ready: true, restart count 3 Nov 26 18:50:42.303: INFO: pod-back-off-image started at 2022-11-26 18:48:36 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container back-off ready: true, restart count 4 Nov 26 18:50:42.303: INFO: coredns-6d97d5ddb-l6khp started at 2022-11-26 18:39:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container coredns ready: false, restart count 6 Nov 26 18:50:42.303: INFO: volume-snapshot-controller-0 started at 2022-11-26 18:39:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container volume-snapshot-controller ready: true, restart count 4 Nov 26 18:50:42.303: INFO: external-local-nodes-z9qgt started at 2022-11-26 18:45:27 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container netexec ready: false, restart count 5 Nov 26 18:50:42.303: INFO: nfs-server started at 2022-11-26 18:46:10 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container nfs-server ready: true, restart count 2 Nov 26 18:50:42.303: INFO: 
pvc-volume-tester-ptk74 started at 2022-11-26 18:44:26 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container volume-tester ready: false, restart count 0 Nov 26 18:50:42.303: INFO: csi-mockplugin-0 started at 2022-11-26 18:49:09 +0000 UTC (0+3 container statuses recorded) Nov 26 18:50:42.303: INFO: Container csi-provisioner ready: false, restart count 0 Nov 26 18:50:42.303: INFO: Container driver-registrar ready: false, restart count 0 Nov 26 18:50:42.303: INFO: Container mock ready: false, restart count 0 Nov 26 18:50:42.303: INFO: kube-dns-autoscaler-5f6455f985-rcp87 started at 2022-11-26 18:39:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container autoscaler ready: false, restart count 5 Nov 26 18:50:42.303: INFO: net-tiers-svc-lp9r4 started at 2022-11-26 18:41:49 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container netexec ready: true, restart count 4 Nov 26 18:50:42.303: INFO: csi-mockplugin-attacher-0 started at 2022-11-26 18:44:06 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container csi-attacher ready: true, restart count 3 Nov 26 18:50:42.303: INFO: netserver-0 started at 2022-11-26 18:44:09 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container webserver ready: true, restart count 4 Nov 26 18:50:42.303: INFO: external-local-update-nfqlg started at 2022-11-26 18:46:40 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.303: INFO: Container netexec ready: true, restart count 0 Nov 26 18:50:42.502: INFO: Latency metrics for node bootstrap-e2e-minion-group-dzls Nov 26 18:50:42.502: INFO: Logging node info for node bootstrap-e2e-minion-group-gnb8 Nov 26 18:50:42.543: INFO: Node Info: &Node{ObjectMeta:{bootstrap-e2e-minion-group-gnb8 d034f9b4-7cc6-4262-a4af-02040c7b7abd 6839 0 2022-11-26 18:39:33 +0000 UTC <nil> <nil> map[beta.kubernetes.io/arch:amd64 beta.kubernetes.io/instance-type:n1-standard-2 beta.kubernetes.io/os:linux cloud.google.com/metadata-proxy-ready:true failure-domain.beta.kubernetes.io/region:us-west1 failure-domain.beta.kubernetes.io/zone:us-west1-b kubernetes.io/arch:amd64 kubernetes.io/hostname:bootstrap-e2e-minion-group-gnb8 kubernetes.io/os:linux node.kubernetes.io/instance-type:n1-standard-2 topology.hostpath.csi/node:bootstrap-e2e-minion-group-gnb8 topology.kubernetes.io/region:us-west1 topology.kubernetes.io/zone:us-west1-b] map[csi.volume.kubernetes.io/nodeid:{"csi-hostpath-multivolume-3170":"bootstrap-e2e-minion-group-gnb8","csi-hostpath-provisioning-1566":"bootstrap-e2e-minion-group-gnb8","csi-mock-csi-mock-volumes-9058":"csi-mock-csi-mock-volumes-9058"} node.alpha.kubernetes.io/ttl:0 volumes.kubernetes.io/controller-managed-attach-detach:true] [] [] [{kube-controller-manager Update v1 2022-11-26 18:39:33 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}},"f:spec":{"f:podCIDR":{},"f:podCIDRs":{".":{},"v:\"10.64.2.0/24\"":{}}}} } {kubelet Update v1 2022-11-26 18:39:33 +0000 UTC FieldsV1 
{"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-attach-detach":{}},"f:labels":{".":{},"f:beta.kubernetes.io/arch":{},"f:beta.kubernetes.io/instance-type":{},"f:beta.kubernetes.io/os":{},"f:cloud.google.com/metadata-proxy-ready":{},"f:failure-domain.beta.kubernetes.io/region":{},"f:failure-domain.beta.kubernetes.io/zone":{},"f:kubernetes.io/arch":{},"f:kubernetes.io/hostname":{},"f:kubernetes.io/os":{},"f:node.kubernetes.io/instance-type":{},"f:topology.kubernetes.io/region":{},"f:topology.kubernetes.io/zone":{}}},"f:spec":{"f:providerID":{}}} } {kube-controller-manager Update v1 2022-11-26 18:45:11 +0000 UTC FieldsV1 {"f:status":{"f:conditions":{"k:{\"type\":\"NetworkUnavailable\"}":{"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{}}}}} status} {kubelet Update v1 2022-11-26 18:50:17 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:csi.volume.kubernetes.io/nodeid":{}},"f:labels":{"f:topology.hostpath.csi/node":{}}},"f:status":{"f:conditions":{"k:{\"type\":\"DiskPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"MemoryPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"PIDPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"Ready\"}":{"f:lastHeartbeatTime":{},"f:message":{},"f:reason":{},"f:status":{}}},"f:images":{}}} status} {node-problem-detector Update v1 2022-11-26 18:50:21 +0000 UTC FieldsV1 {"f:status":{"f:conditions":{"k:{\"type\":\"CorruptDockerOverlay2\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentContainerdRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentDockerRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentKubeletRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentUnregisterNetDevice\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"KernelDeadlock\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"ReadonlyFilesystem\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}}}}} status}]},Spec:NodeSpec{PodCIDR:10.64.2.0/24,DoNotUseExternalID:,ProviderID:gce://k8s-jkns-e2e-gke-ubuntu-slow/us-west1-b/bootstrap-e2e-minion-group-gnb8,Unschedulable:false,Taints:[]Taint{},ConfigSource:nil,PodCIDRs:[10.64.2.0/24],},Status:NodeStatus{Capacity:ResourceList{attachable-volumes-gce-pd: {{127 0} {<nil>} 127 DecimalSI},cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{101203873792 0} {<nil>} 98831908Ki BinarySI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{7815438336 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Allocatable:ResourceList{attachable-volumes-gce-pd: {{127 0} {<nil>} 127 DecimalSI},cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{91083486262 0} {<nil>} 91083486262 DecimalSI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{7553294336 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 
DecimalSI},},Phase:,Conditions:[]NodeCondition{NodeCondition{Type:FrequentDockerRestart,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentDockerRestart,Message:docker is functioning properly,},NodeCondition{Type:FrequentContainerdRestart,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentContainerdRestart,Message:containerd is functioning properly,},NodeCondition{Type:CorruptDockerOverlay2,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoCorruptDockerOverlay2,Message:docker overlay2 is functioning properly,},NodeCondition{Type:KernelDeadlock,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:KernelHasNoDeadlock,Message:kernel has no deadlock,},NodeCondition{Type:ReadonlyFilesystem,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:FilesystemIsNotReadOnly,Message:Filesystem is not read-only,},NodeCondition{Type:FrequentUnregisterNetDevice,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentUnregisterNetDevice,Message:node is functioning properly,},NodeCondition{Type:FrequentKubeletRestart,Status:False,LastHeartbeatTime:2022-11-26 18:50:21 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentKubeletRestart,Message:kubelet is functioning properly,},NodeCondition{Type:NetworkUnavailable,Status:False,LastHeartbeatTime:2022-11-26 18:39:46 +0000 UTC,LastTransitionTime:2022-11-26 18:39:46 +0000 UTC,Reason:RouteCreated,Message:RouteController created a route,},NodeCondition{Type:MemoryPressure,Status:False,LastHeartbeatTime:2022-11-26 18:47:04 +0000 UTC,LastTransitionTime:2022-11-26 18:39:33 +0000 UTC,Reason:KubeletHasSufficientMemory,Message:kubelet has sufficient memory available,},NodeCondition{Type:DiskPressure,Status:False,LastHeartbeatTime:2022-11-26 18:47:04 +0000 UTC,LastTransitionTime:2022-11-26 18:39:33 +0000 UTC,Reason:KubeletHasNoDiskPressure,Message:kubelet has no disk pressure,},NodeCondition{Type:PIDPressure,Status:False,LastHeartbeatTime:2022-11-26 18:47:04 +0000 UTC,LastTransitionTime:2022-11-26 18:39:33 +0000 UTC,Reason:KubeletHasSufficientPID,Message:kubelet has sufficient PID available,},NodeCondition{Type:Ready,Status:True,LastHeartbeatTime:2022-11-26 18:47:04 +0000 UTC,LastTransitionTime:2022-11-26 18:39:33 +0000 UTC,Reason:KubeletReady,Message:kubelet is posting ready status. 
AppArmor enabled,},},Addresses:[]NodeAddress{NodeAddress{Type:InternalIP,Address:10.138.0.4,},NodeAddress{Type:ExternalIP,Address:34.83.41.63,},NodeAddress{Type:InternalDNS,Address:bootstrap-e2e-minion-group-gnb8.c.k8s-jkns-e2e-gke-ubuntu-slow.internal,},NodeAddress{Type:Hostname,Address:bootstrap-e2e-minion-group-gnb8.c.k8s-jkns-e2e-gke-ubuntu-slow.internal,},},DaemonEndpoints:NodeDaemonEndpoints{KubeletEndpoint:DaemonEndpoint{Port:10250,},},NodeInfo:NodeSystemInfo{MachineID:975025d51ee8dddd080415bf76b18e9f,SystemUUID:975025d5-1ee8-dddd-0804-15bf76b18e9f,BootID:78ba7c41-cb6b-4d0e-9530-01248f3478cf,KernelVersion:5.10.123+,OSImage:Container-Optimized OS from Google,ContainerRuntimeVersion:containerd://1.7.0-beta.0-149-gd06318622,KubeletVersion:v1.27.0-alpha.0.50+70617042976dc1,KubeProxyVersion:v1.27.0-alpha.0.50+70617042976dc1,OperatingSystem:linux,Architecture:amd64,},Images:[]ContainerImage{ContainerImage{Names:[registry.k8s.io/e2e-test-images/jessie-dnsutils@sha256:24aaf2626d6b27864c29de2097e8bbb840b3a414271bf7c8995e431e47d8408e registry.k8s.io/e2e-test-images/jessie-dnsutils:1.7],SizeBytes:112030336,},ContainerImage{Names:[registry.k8s.io/sig-storage/nfs-provisioner@sha256:e943bb77c7df05ebdc8c7888b2db289b13bf9f012d6a3a5a74f14d4d5743d439 registry.k8s.io/sig-storage/nfs-provisioner:v3.0.1],SizeBytes:90632047,},ContainerImage{Names:[registry.k8s.io/kube-proxy-amd64:v1.27.0-alpha.0.50_70617042976dc1],SizeBytes:67201736,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/agnhost@sha256:16bbf38c463a4223d8cfe4da12bc61010b082a79b4bb003e2d3ba3ece5dd5f9e registry.k8s.io/e2e-test-images/agnhost:2.43],SizeBytes:51706353,},ContainerImage{Names:[gke.gcr.io/prometheus-to-sd@sha256:e739643c3939ba0b161425f45a1989eedfc4a3b166db9a7100863296b4c70510 gke.gcr.io/prometheus-to-sd:v0.11.1-gke.1],SizeBytes:48742566,},ContainerImage{Names:[registry.k8s.io/metrics-server/metrics-server@sha256:6385aec64bb97040a5e692947107b81e178555c7a5b71caa90d733e4130efc10 registry.k8s.io/metrics-server/metrics-server:v0.5.2],SizeBytes:26023008,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-provisioner@sha256:ee3b525d5b89db99da3b8eb521d9cd90cb6e9ef0fbb651e98bb37be78d36b5b8 registry.k8s.io/sig-storage/csi-provisioner:v3.3.0],SizeBytes:25491225,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-resizer@sha256:425d8f1b769398127767b06ed97ce62578a3179bcb99809ce93a1649e025ffe7 registry.k8s.io/sig-storage/csi-resizer:v1.6.0],SizeBytes:24148884,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-snapshotter@sha256:291334908ddf71a4661fd7f6d9d97274de8a5378a2b6fdfeb2ce73414a34f82f registry.k8s.io/sig-storage/csi-snapshotter:v6.1.0],SizeBytes:23881995,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-attacher@sha256:9a685020911e2725ad019dbce6e4a5ab93d51e3d4557f115e64343345e05781b registry.k8s.io/sig-storage/csi-attacher:v4.0.0],SizeBytes:23847201,},ContainerImage{Names:[registry.k8s.io/sig-storage/hostpathplugin@sha256:92257881c1d6493cf18299a24af42330f891166560047902b8d431fb66b01af5 registry.k8s.io/sig-storage/hostpathplugin:v1.9.0],SizeBytes:18758628,},ContainerImage{Names:[registry.k8s.io/autoscaling/addon-resizer@sha256:43f129b81d28f0fdd54de6d8e7eacd5728030782e03db16087fc241ad747d3d6 registry.k8s.io/autoscaling/addon-resizer:1.8.14],SizeBytes:10153852,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-node-driver-registrar@sha256:0103eee7c35e3e0b5cd8cdca9850dc71c793cdeb6669d8be7a89440da2d06ae4 
registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.5.1],SizeBytes:9133109,},ContainerImage{Names:[registry.k8s.io/sig-storage/livenessprobe@sha256:933940f13b3ea0abc62e656c1aa5c5b47c04b15d71250413a6b821bd0c58b94e registry.k8s.io/sig-storage/livenessprobe:v2.7.0],SizeBytes:8688564,},ContainerImage{Names:[registry.k8s.io/kas-network-proxy/proxy-agent@sha256:48f2a4ec3e10553a81b8dd1c6fa5fe4bcc9617f78e71c1ca89c6921335e2d7da registry.k8s.io/kas-network-proxy/proxy-agent:v0.0.33],SizeBytes:8512162,},ContainerImage{Names:[registry.k8s.io/metadata-proxy@sha256:e914645f22e946bce5165737e1b244e0a296ad1f0f81a9531adc57af2780978a registry.k8s.io/metadata-proxy:v0.1.12],SizeBytes:5301657,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/busybox@sha256:c318242786b139d18676b1c09a0ad7f15fc17f8f16a5b2e625cd0dc8c9703daf registry.k8s.io/e2e-test-images/busybox:1.29-2],SizeBytes:732424,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/busybox@sha256:2e0f836850e09b8b7cc937681d6194537a09fbd5f6b9e08f4d646a85128e8937 registry.k8s.io/e2e-test-images/busybox:1.29-4],SizeBytes:731990,},ContainerImage{Names:[registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097 registry.k8s.io/pause:3.9],SizeBytes:321520,},ContainerImage{Names:[registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d registry.k8s.io/pause:3.8],SizeBytes:311286,},},VolumesInUse:[],VolumesAttached:[]AttachedVolume{},Config:nil,},} Nov 26 18:50:42.544: INFO: Logging kubelet events for node bootstrap-e2e-minion-group-gnb8 Nov 26 18:50:42.593: INFO: Logging pods the kubelet thinks is on node bootstrap-e2e-minion-group-gnb8 Nov 26 18:50:42.653: INFO: metadata-proxy-v0.1-c77zk started at 2022-11-26 18:39:33 +0000 UTC (0+2 container statuses recorded) Nov 26 18:50:42.653: INFO: Container metadata-proxy ready: true, restart count 0 Nov 26 18:50:42.653: INFO: Container prometheus-to-sd-exporter ready: true, restart count 0 Nov 26 18:50:42.653: INFO: csi-hostpathplugin-0 started at 2022-11-26 18:44:47 +0000 UTC (0+7 container statuses recorded) Nov 26 18:50:42.653: INFO: Container csi-attacher ready: true, restart count 1 Nov 26 18:50:42.653: INFO: Container csi-provisioner ready: true, restart count 1 Nov 26 18:50:42.653: INFO: Container csi-resizer ready: true, restart count 1 Nov 26 18:50:42.653: INFO: Container csi-snapshotter ready: true, restart count 1 Nov 26 18:50:42.653: INFO: Container hostpath ready: true, restart count 1 Nov 26 18:50:42.653: INFO: Container liveness-probe ready: true, restart count 1 Nov 26 18:50:42.653: INFO: Container node-driver-registrar ready: true, restart count 3 Nov 26 18:50:42.653: INFO: csi-hostpathplugin-0 started at 2022-11-26 18:43:33 +0000 UTC (0+7 container statuses recorded) Nov 26 18:50:42.653: INFO: Container csi-attacher ready: true, restart count 4 Nov 26 18:50:42.653: INFO: Container csi-provisioner ready: true, restart count 4 Nov 26 18:50:42.653: INFO: Container csi-resizer ready: true, restart count 4 Nov 26 18:50:42.653: INFO: Container csi-snapshotter ready: true, restart count 4 Nov 26 18:50:42.653: INFO: Container hostpath ready: true, restart count 4 Nov 26 18:50:42.653: INFO: Container liveness-probe ready: true, restart count 4 Nov 26 18:50:42.653: INFO: Container node-driver-registrar ready: true, restart count 4 Nov 26 18:50:42.653: INFO: kube-proxy-bootstrap-e2e-minion-group-gnb8 started at 2022-11-26 18:39:33 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.653: INFO: Container kube-proxy ready: 
true, restart count 6 Nov 26 18:50:42.653: INFO: metrics-server-v0.5.2-867b8754b9-5tc55 started at 2022-11-26 18:40:04 +0000 UTC (0+2 container statuses recorded) Nov 26 18:50:42.653: INFO: Container metrics-server ready: false, restart count 7 Nov 26 18:50:42.653: INFO: Container metrics-server-nanny ready: false, restart count 6 Nov 26 18:50:42.653: INFO: netserver-1 started at 2022-11-26 18:44:09 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.653: INFO: Container webserver ready: true, restart count 4 Nov 26 18:50:42.653: INFO: affinity-lb-esipp-transition-qkrbm started at 2022-11-26 18:45:17 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.653: INFO: Container affinity-lb-esipp-transition ready: true, restart count 0 Nov 26 18:50:42.653: INFO: netserver-1 started at 2022-11-26 18:46:43 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.653: INFO: Container webserver ready: true, restart count 1 Nov 26 18:50:42.653: INFO: konnectivity-agent-rlpjb started at 2022-11-26 18:39:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.653: INFO: Container konnectivity-agent ready: false, restart count 5 Nov 26 18:50:42.653: INFO: csi-mockplugin-0 started at 2022-11-26 18:46:01 +0000 UTC (0+4 container statuses recorded) Nov 26 18:50:42.653: INFO: Container busybox ready: true, restart count 1 Nov 26 18:50:42.653: INFO: Container csi-provisioner ready: true, restart count 2 Nov 26 18:50:42.653: INFO: Container driver-registrar ready: true, restart count 1 Nov 26 18:50:42.653: INFO: Container mock ready: true, restart count 1 Nov 26 18:50:42.653: INFO: csi-hostpathplugin-0 started at 2022-11-26 18:44:49 +0000 UTC (0+7 container statuses recorded) Nov 26 18:50:42.653: INFO: Container csi-attacher ready: false, restart count 4 Nov 26 18:50:42.653: INFO: Container csi-provisioner ready: false, restart count 4 Nov 26 18:50:42.653: INFO: Container csi-resizer ready: false, restart count 4 Nov 26 18:50:42.653: INFO: Container csi-snapshotter ready: false, restart count 4 Nov 26 18:50:42.653: INFO: Container hostpath ready: false, restart count 4 Nov 26 18:50:42.653: INFO: Container liveness-probe ready: false, restart count 4 Nov 26 18:50:42.653: INFO: Container node-driver-registrar ready: false, restart count 4 Nov 26 18:50:42.653: INFO: hostexec-bootstrap-e2e-minion-group-gnb8-dk56q started at 2022-11-26 18:48:48 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.653: INFO: Container agnhost-container ready: true, restart count 2 Nov 26 18:50:42.653: INFO: pod-105898ad-3540-4a0c-84c6-e8ce393b8408 started at 2022-11-26 18:49:08 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.653: INFO: Container write-pod ready: true, restart count 0 Nov 26 18:50:42.653: INFO: execpod-dropfkgh8 started at 2022-11-26 18:41:56 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:42.653: INFO: Container agnhost-container ready: false, restart count 3 Nov 26 18:50:42.881: INFO: Latency metrics for node bootstrap-e2e-minion-group-gnb8 Nov 26 18:50:42.881: INFO: Logging node info for node bootstrap-e2e-minion-group-p1wq Nov 26 18:50:42.923: INFO: Node Info: &Node{ObjectMeta:{bootstrap-e2e-minion-group-p1wq e4a1a2e4-c0df-40e4-9f3f-bc50b8a3f74f 6858 0 2022-11-26 18:39:32 +0000 UTC <nil> <nil> map[beta.kubernetes.io/arch:amd64 beta.kubernetes.io/instance-type:n1-standard-2 beta.kubernetes.io/os:linux cloud.google.com/metadata-proxy-ready:true failure-domain.beta.kubernetes.io/region:us-west1 failure-domain.beta.kubernetes.io/zone:us-west1-b 
kubernetes.io/arch:amd64 kubernetes.io/hostname:bootstrap-e2e-minion-group-p1wq kubernetes.io/os:linux node.kubernetes.io/instance-type:n1-standard-2 topology.hostpath.csi/node:bootstrap-e2e-minion-group-p1wq topology.kubernetes.io/region:us-west1 topology.kubernetes.io/zone:us-west1-b] map[csi.volume.kubernetes.io/nodeid:{"csi-hostpath-provisioning-8805":"bootstrap-e2e-minion-group-p1wq"} node.alpha.kubernetes.io/ttl:0 volumes.kubernetes.io/controller-managed-attach-detach:true] [] [] [{kubelet Update v1 2022-11-26 18:39:32 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-attach-detach":{}},"f:labels":{".":{},"f:beta.kubernetes.io/arch":{},"f:beta.kubernetes.io/instance-type":{},"f:beta.kubernetes.io/os":{},"f:cloud.google.com/metadata-proxy-ready":{},"f:failure-domain.beta.kubernetes.io/region":{},"f:failure-domain.beta.kubernetes.io/zone":{},"f:kubernetes.io/arch":{},"f:kubernetes.io/hostname":{},"f:kubernetes.io/os":{},"f:node.kubernetes.io/instance-type":{},"f:topology.kubernetes.io/region":{},"f:topology.kubernetes.io/zone":{}}},"f:spec":{"f:providerID":{}}} } {kube-controller-manager Update v1 2022-11-26 18:39:33 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}},"f:spec":{"f:podCIDR":{},"f:podCIDRs":{".":{},"v:\"10.64.1.0/24\"":{}}}} } {kube-controller-manager Update v1 2022-11-26 18:47:00 +0000 UTC FieldsV1 {"f:status":{"f:conditions":{"k:{\"type\":\"NetworkUnavailable\"}":{"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{}}}}} status} {node-problem-detector Update v1 2022-11-26 18:50:20 +0000 UTC FieldsV1 {"f:status":{"f:conditions":{"k:{\"type\":\"CorruptDockerOverlay2\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentContainerdRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentDockerRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentKubeletRestart\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"FrequentUnregisterNetDevice\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"KernelDeadlock\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"ReadonlyFilesystem\"}":{".":{},"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}}}}} status} {kubelet Update v1 2022-11-26 18:50:29 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:csi.volume.kubernetes.io/nodeid":{}},"f:labels":{"f:topology.hostpath.csi/node":{}}},"f:status":{"f:conditions":{"k:{\"type\":\"DiskPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"MemoryPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"PIDPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"Ready\"}":{"f:lastHeartbeatTime":{},"f:message":{},"f:reason":{},"f:status":{}}},"f:images":{}}} 
status}]},Spec:NodeSpec{PodCIDR:10.64.1.0/24,DoNotUseExternalID:,ProviderID:gce://k8s-jkns-e2e-gke-ubuntu-slow/us-west1-b/bootstrap-e2e-minion-group-p1wq,Unschedulable:false,Taints:[]Taint{},ConfigSource:nil,PodCIDRs:[10.64.1.0/24],},Status:NodeStatus{Capacity:ResourceList{attachable-volumes-gce-pd: {{127 0} {<nil>} 127 DecimalSI},cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{101203873792 0} {<nil>} 98831908Ki BinarySI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{7815438336 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Allocatable:ResourceList{attachable-volumes-gce-pd: {{127 0} {<nil>} 127 DecimalSI},cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{91083486262 0} {<nil>} 91083486262 DecimalSI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{7553294336 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Phase:,Conditions:[]NodeCondition{NodeCondition{Type:FrequentDockerRestart,Status:False,LastHeartbeatTime:2022-11-26 18:50:20 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentDockerRestart,Message:docker is functioning properly,},NodeCondition{Type:FrequentContainerdRestart,Status:False,LastHeartbeatTime:2022-11-26 18:50:20 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentContainerdRestart,Message:containerd is functioning properly,},NodeCondition{Type:CorruptDockerOverlay2,Status:False,LastHeartbeatTime:2022-11-26 18:50:20 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoCorruptDockerOverlay2,Message:docker overlay2 is functioning properly,},NodeCondition{Type:KernelDeadlock,Status:False,LastHeartbeatTime:2022-11-26 18:50:20 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:KernelHasNoDeadlock,Message:kernel has no deadlock,},NodeCondition{Type:ReadonlyFilesystem,Status:False,LastHeartbeatTime:2022-11-26 18:50:20 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:FilesystemIsNotReadOnly,Message:Filesystem is not read-only,},NodeCondition{Type:FrequentUnregisterNetDevice,Status:False,LastHeartbeatTime:2022-11-26 18:50:20 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentUnregisterNetDevice,Message:node is functioning properly,},NodeCondition{Type:FrequentKubeletRestart,Status:False,LastHeartbeatTime:2022-11-26 18:50:20 +0000 UTC,LastTransitionTime:2022-11-26 18:39:36 +0000 UTC,Reason:NoFrequentKubeletRestart,Message:kubelet is functioning properly,},NodeCondition{Type:NetworkUnavailable,Status:False,LastHeartbeatTime:2022-11-26 18:39:46 +0000 UTC,LastTransitionTime:2022-11-26 18:39:46 +0000 UTC,Reason:RouteCreated,Message:RouteController created a route,},NodeCondition{Type:MemoryPressure,Status:False,LastHeartbeatTime:2022-11-26 18:48:55 +0000 UTC,LastTransitionTime:2022-11-26 18:39:32 +0000 UTC,Reason:KubeletHasSufficientMemory,Message:kubelet has sufficient memory available,},NodeCondition{Type:DiskPressure,Status:False,LastHeartbeatTime:2022-11-26 18:48:55 +0000 UTC,LastTransitionTime:2022-11-26 18:39:32 +0000 UTC,Reason:KubeletHasNoDiskPressure,Message:kubelet has no disk pressure,},NodeCondition{Type:PIDPressure,Status:False,LastHeartbeatTime:2022-11-26 18:48:55 +0000 UTC,LastTransitionTime:2022-11-26 18:39:32 +0000 UTC,Reason:KubeletHasSufficientPID,Message:kubelet has sufficient PID available,},NodeCondition{Type:Ready,Status:True,LastHeartbeatTime:2022-11-26 18:48:55 +0000 UTC,LastTransitionTime:2022-11-26 18:39:32 
+0000 UTC,Reason:KubeletReady,Message:kubelet is posting ready status. AppArmor enabled,},},Addresses:[]NodeAddress{NodeAddress{Type:InternalIP,Address:10.138.0.3,},NodeAddress{Type:ExternalIP,Address:34.168.33.242,},NodeAddress{Type:InternalDNS,Address:bootstrap-e2e-minion-group-p1wq.c.k8s-jkns-e2e-gke-ubuntu-slow.internal,},NodeAddress{Type:Hostname,Address:bootstrap-e2e-minion-group-p1wq.c.k8s-jkns-e2e-gke-ubuntu-slow.internal,},},DaemonEndpoints:NodeDaemonEndpoints{KubeletEndpoint:DaemonEndpoint{Port:10250,},},NodeInfo:NodeSystemInfo{MachineID:af52744010a8d3f305ac13b550d67291,SystemUUID:af527440-10a8-d3f3-05ac-13b550d67291,BootID:aec04ea1-f168-4b94-9143-5f1e29a8fc63,KernelVersion:5.10.123+,OSImage:Container-Optimized OS from Google,ContainerRuntimeVersion:containerd://1.7.0-beta.0-149-gd06318622,KubeletVersion:v1.27.0-alpha.0.50+70617042976dc1,KubeProxyVersion:v1.27.0-alpha.0.50+70617042976dc1,OperatingSystem:linux,Architecture:amd64,},Images:[]ContainerImage{ContainerImage{Names:[registry.k8s.io/kube-proxy-amd64:v1.27.0-alpha.0.50_70617042976dc1],SizeBytes:67201736,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/agnhost@sha256:16bbf38c463a4223d8cfe4da12bc61010b082a79b4bb003e2d3ba3ece5dd5f9e registry.k8s.io/e2e-test-images/agnhost:2.43],SizeBytes:51706353,},ContainerImage{Names:[gke.gcr.io/prometheus-to-sd@sha256:e739643c3939ba0b161425f45a1989eedfc4a3b166db9a7100863296b4c70510 gke.gcr.io/prometheus-to-sd:v0.11.1-gke.1],SizeBytes:48742566,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-provisioner@sha256:ee3b525d5b89db99da3b8eb521d9cd90cb6e9ef0fbb651e98bb37be78d36b5b8 registry.k8s.io/sig-storage/csi-provisioner:v3.3.0],SizeBytes:25491225,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-resizer@sha256:425d8f1b769398127767b06ed97ce62578a3179bcb99809ce93a1649e025ffe7 registry.k8s.io/sig-storage/csi-resizer:v1.6.0],SizeBytes:24148884,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-snapshotter@sha256:291334908ddf71a4661fd7f6d9d97274de8a5378a2b6fdfeb2ce73414a34f82f registry.k8s.io/sig-storage/csi-snapshotter:v6.1.0],SizeBytes:23881995,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-attacher@sha256:9a685020911e2725ad019dbce6e4a5ab93d51e3d4557f115e64343345e05781b registry.k8s.io/sig-storage/csi-attacher:v4.0.0],SizeBytes:23847201,},ContainerImage{Names:[registry.k8s.io/sig-storage/hostpathplugin@sha256:92257881c1d6493cf18299a24af42330f891166560047902b8d431fb66b01af5 registry.k8s.io/sig-storage/hostpathplugin:v1.9.0],SizeBytes:18758628,},ContainerImage{Names:[registry.k8s.io/coredns/coredns@sha256:8e352a029d304ca7431c6507b56800636c321cb52289686a581ab70aaa8a2e2a registry.k8s.io/coredns/coredns:v1.9.3],SizeBytes:14837849,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-node-driver-registrar@sha256:0103eee7c35e3e0b5cd8cdca9850dc71c793cdeb6669d8be7a89440da2d06ae4 registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.5.1],SizeBytes:9133109,},ContainerImage{Names:[registry.k8s.io/sig-storage/livenessprobe@sha256:933940f13b3ea0abc62e656c1aa5c5b47c04b15d71250413a6b821bd0c58b94e registry.k8s.io/sig-storage/livenessprobe:v2.7.0],SizeBytes:8688564,},ContainerImage{Names:[registry.k8s.io/kas-network-proxy/proxy-agent@sha256:48f2a4ec3e10553a81b8dd1c6fa5fe4bcc9617f78e71c1ca89c6921335e2d7da registry.k8s.io/kas-network-proxy/proxy-agent:v0.0.33],SizeBytes:8512162,},ContainerImage{Names:[registry.k8s.io/metadata-proxy@sha256:e914645f22e946bce5165737e1b244e0a296ad1f0f81a9531adc57af2780978a 
registry.k8s.io/metadata-proxy:v0.1.12],SizeBytes:5301657,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/busybox@sha256:c318242786b139d18676b1c09a0ad7f15fc17f8f16a5b2e625cd0dc8c9703daf registry.k8s.io/e2e-test-images/busybox:1.29-2],SizeBytes:732424,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/busybox@sha256:2e0f836850e09b8b7cc937681d6194537a09fbd5f6b9e08f4d646a85128e8937 registry.k8s.io/e2e-test-images/busybox:1.29-4],SizeBytes:731990,},ContainerImage{Names:[registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097 registry.k8s.io/pause:3.9],SizeBytes:321520,},ContainerImage{Names:[registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d registry.k8s.io/pause:3.8],SizeBytes:311286,},},VolumesInUse:[],VolumesAttached:[]AttachedVolume{},Config:nil,},} Nov 26 18:50:42.924: INFO: Logging kubelet events for node bootstrap-e2e-minion-group-p1wq Nov 26 18:50:42.968: INFO: Logging pods the kubelet thinks is on node bootstrap-e2e-minion-group-p1wq Nov 26 18:50:43.026: INFO: csi-hostpathplugin-0 started at 2022-11-26 18:44:06 +0000 UTC (0+7 container statuses recorded) Nov 26 18:50:43.026: INFO: Container csi-attacher ready: false, restart count 4 Nov 26 18:50:43.026: INFO: Container csi-provisioner ready: false, restart count 4 Nov 26 18:50:43.026: INFO: Container csi-resizer ready: false, restart count 4 Nov 26 18:50:43.026: INFO: Container csi-snapshotter ready: false, restart count 4 Nov 26 18:50:43.026: INFO: Container hostpath ready: false, restart count 4 Nov 26 18:50:43.026: INFO: Container liveness-probe ready: false, restart count 4 Nov 26 18:50:43.026: INFO: Container node-driver-registrar ready: false, restart count 4 Nov 26 18:50:43.026: INFO: csi-mockplugin-attacher-0 started at 2022-11-26 18:41:51 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:43.026: INFO: Container csi-attacher ready: true, restart count 4 Nov 26 18:50:43.026: INFO: affinity-lb-esipp-transition-m2r6p started at 2022-11-26 18:45:17 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:43.026: INFO: Container affinity-lb-esipp-transition ready: true, restart count 0 Nov 26 18:50:43.026: INFO: csi-hostpathplugin-0 started at 2022-11-26 18:46:57 +0000 UTC (0+7 container statuses recorded) Nov 26 18:50:43.026: INFO: Container csi-attacher ready: true, restart count 0 Nov 26 18:50:43.026: INFO: Container csi-provisioner ready: true, restart count 0 Nov 26 18:50:43.026: INFO: Container csi-resizer ready: true, restart count 0 Nov 26 18:50:43.026: INFO: Container csi-snapshotter ready: true, restart count 0 Nov 26 18:50:43.026: INFO: Container hostpath ready: true, restart count 0 Nov 26 18:50:43.026: INFO: Container liveness-probe ready: true, restart count 0 Nov 26 18:50:43.026: INFO: Container node-driver-registrar ready: true, restart count 0 Nov 26 18:50:43.026: INFO: metadata-proxy-v0.1-89kmg started at 2022-11-26 18:39:33 +0000 UTC (0+2 container statuses recorded) Nov 26 18:50:43.026: INFO: Container metadata-proxy ready: true, restart count 0 Nov 26 18:50:43.026: INFO: Container prometheus-to-sd-exporter ready: true, restart count 0 Nov 26 18:50:43.026: INFO: konnectivity-agent-dk4z2 started at 2022-11-26 18:39:46 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:43.026: INFO: Container konnectivity-agent ready: true, restart count 4 Nov 26 18:50:43.026: INFO: coredns-6d97d5ddb-xg7m7 started at 2022-11-26 18:39:51 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:43.026: INFO: Container coredns 
ready: false, restart count 5 Nov 26 18:50:43.026: INFO: csi-mockplugin-0 started at 2022-11-26 18:41:51 +0000 UTC (0+3 container statuses recorded) Nov 26 18:50:43.026: INFO: Container csi-provisioner ready: false, restart count 4 Nov 26 18:50:43.026: INFO: Container driver-registrar ready: false, restart count 4 Nov 26 18:50:43.026: INFO: Container mock ready: false, restart count 4 Nov 26 18:50:43.026: INFO: kube-proxy-bootstrap-e2e-minion-group-p1wq started at 2022-11-26 18:39:32 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:43.026: INFO: Container kube-proxy ready: false, restart count 5 Nov 26 18:50:43.026: INFO: csi-mockplugin-0 started at 2022-11-26 18:45:36 +0000 UTC (0+4 container statuses recorded) Nov 26 18:50:43.026: INFO: Container busybox ready: true, restart count 3 Nov 26 18:50:43.026: INFO: Container csi-provisioner ready: false, restart count 3 Nov 26 18:50:43.026: INFO: Container driver-registrar ready: true, restart count 4 Nov 26 18:50:43.026: INFO: Container mock ready: true, restart count 4 Nov 26 18:50:43.026: INFO: netserver-2 started at 2022-11-26 18:44:09 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:43.026: INFO: Container webserver ready: false, restart count 5 Nov 26 18:50:43.026: INFO: pod-subpath-test-inlinevolume-gshk started at 2022-11-26 18:49:03 +0000 UTC (1+2 container statuses recorded) Nov 26 18:50:43.026: INFO: Init container init-volume-inlinevolume-gshk ready: true, restart count 1 Nov 26 18:50:43.026: INFO: Container test-container-subpath-inlinevolume-gshk ready: true, restart count 1 Nov 26 18:50:43.026: INFO: Container test-container-volume-inlinevolume-gshk ready: true, restart count 1 Nov 26 18:50:43.026: INFO: netserver-2 started at 2022-11-26 18:46:43 +0000 UTC (0+1 container statuses recorded) Nov 26 18:50:43.026: INFO: Container webserver ready: true, restart count 4 Nov 26 18:50:43.026: INFO: csi-mockplugin-0 started at 2022-11-26 18:44:44 +0000 UTC (0+4 container statuses recorded) Nov 26 18:50:43.026: INFO: Container busybox ready: true, restart count 3 Nov 26 18:50:43.026: INFO: Container csi-provisioner ready: false, restart count 4 Nov 26 18:50:43.026: INFO: Container driver-registrar ready: true, restart count 5 Nov 26 18:50:43.026: INFO: Container mock ready: true, restart count 5 Nov 26 18:50:43.251: INFO: Latency metrics for node bootstrap-e2e-minion-group-p1wq [DeferCleanup (Each)] [sig-network] LoadBalancers ESIPP [Slow] tear down framework | framework.go:193 STEP: Destroying namespace "esipp-5938" for this suite. 11/26/22 18:50:43.251
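The block above is the e2e framework's %+v dump of the node's v1.NodeStatus (addresses, daemon endpoints, NodeSystemInfo, cached images) followed by the kubelet's per-pod container readiness. A minimal sketch, assuming a configured client-go clientset and the node name taken from the dump, of reading the same status fields programmatically; the function name printNodeSummary is illustrative and not part of the test framework:

package e2esketch

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
)

// printNodeSummary reads the same v1.NodeStatus fields that the framework
// dumps above: addresses, kubelet/runtime versions, and the cached image list.
func printNodeSummary(cs kubernetes.Interface, nodeName string) error {
	node, err := cs.CoreV1().Nodes().Get(context.TODO(), nodeName, metav1.GetOptions{})
	if err != nil {
		return err
	}
	for _, addr := range node.Status.Addresses {
		fmt.Printf("%s: %s\n", addr.Type, addr.Address)
	}
	info := node.Status.NodeInfo
	fmt.Printf("kubelet=%s runtime=%s os=%s/%s\n",
		info.KubeletVersion, info.ContainerRuntimeVersion, info.OperatingSystem, info.Architecture)
	fmt.Printf("cached images: %d\n", len(node.Status.Images))
	return nil
}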
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-network\]\sLoadBalancers\sESIPP\s\[Slow\]\sshould\swork\sfor\stype\=LoadBalancer$'
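The ginkgo.focus value above is a regular expression matched against the full spec name, with \s standing in for the spaces in "[sig-network] LoadBalancers ESIPP [Slow] should work for type=LoadBalancer". If the e2e.test binary has already been built, the same spec can be focused directly; the path and quoting here are illustrative, not taken from this job:

./e2e.test --ginkgo.focus='\[sig-network\] LoadBalancers ESIPP \[Slow\] should work for type=LoadBalancer$'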
test/e2e/network/loadbalancer.go:1272 k8s.io/kubernetes/test/e2e/network.glob..func20.3() test/e2e/network/loadbalancer.go:1272 +0xd8 There were additional failures detected after the initial failure: [FAILED] Nov 26 19:04:07.872: failed to list events in namespace "esipp-5381": Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/events": dial tcp 34.83.88.61:443: connect: connection refused In [DeferCleanup (Each)] at: test/e2e/framework/debug/dump.go:44 ---------- [FAILED] Nov 26 19:04:07.913: Couldn't delete ns: "esipp-5381": Delete "https://34.83.88.61/api/v1/namespaces/esipp-5381": dial tcp 34.83.88.61:443: connect: connection refused (&url.Error{Op:"Delete", URL:"https://34.83.88.61/api/v1/namespaces/esipp-5381", Err:(*net.OpError)(0xc001f0a370)}) In [DeferCleanup (Each)] at: test/e2e/framework/framework.go:370 from junit_01.xml
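The failure above and the retry log that follows come from the service test jig waiting for the LoadBalancer to be provisioned while the apiserver at 34.83.88.61:443 is refusing connections. A minimal sketch (not the framework's exact code) of the poll-and-retry pattern visible in the goroutine stacks below, using k8s.io/apimachinery's wait.PollImmediate; the names waitForLoadBalancerIP and the 2-second interval are illustrative assumptions:

package e2esketch

import (
	"context"
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/util/wait"
	"k8s.io/client-go/kubernetes"
)

// waitForLoadBalancerIP polls the Service until status.loadBalancer.ingress is
// populated or the timeout expires; API errors (e.g. connection refused while
// the apiserver is down) are logged and retried rather than treated as fatal.
func waitForLoadBalancerIP(cs kubernetes.Interface, ns, name string, timeout time.Duration) (*corev1.Service, error) {
	var svc *corev1.Service
	err := wait.PollImmediate(2*time.Second, timeout, func() (bool, error) {
		s, getErr := cs.CoreV1().Services(ns).Get(context.TODO(), name, metav1.GetOptions{})
		if getErr != nil {
			// Mirrors the repeated "Retrying ...." lines in the log below.
			fmt.Printf("Retrying .... error trying to get Service %s: %v\n", name, getErr)
			return false, nil
		}
		if len(s.Status.LoadBalancer.Ingress) == 0 {
			return false, nil
		}
		svc = s
		return true, nil
	})
	return svc, err
}

Because the condition function returns (false, nil) on errors, the loop keeps polling for the full timeout even though every attempt fails with "connection refused", which is why the log repeats the same error every ~2 seconds until the 15m0s deadline.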
[BeforeEach] [sig-network] LoadBalancers ESIPP [Slow] set up framework | framework.go:178 STEP: Creating a kubernetes client 11/26/22 18:49:06.564 Nov 26 18:49:06.564: INFO: >>> kubeConfig: /workspace/.kube/config STEP: Building a namespace api object, basename esipp 11/26/22 18:49:06.567 STEP: Waiting for a default service account to be provisioned in namespace 11/26/22 18:49:06.817 STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 11/26/22 18:49:06.919 [BeforeEach] [sig-network] LoadBalancers ESIPP [Slow] test/e2e/framework/metrics/init/init.go:31 [BeforeEach] [sig-network] LoadBalancers ESIPP [Slow] test/e2e/network/loadbalancer.go:1250 [It] should work for type=LoadBalancer test/e2e/network/loadbalancer.go:1266 STEP: creating a service esipp-5381/external-local-lb with type=LoadBalancer 11/26/22 18:49:07.2 STEP: setting ExternalTrafficPolicy=Local 11/26/22 18:49:07.2 STEP: waiting for loadbalancer for service esipp-5381/external-local-lb 11/26/22 18:49:07.519 Nov 26 18:49:07.519: INFO: Waiting up to 15m0s for service "external-local-lb" to have a LoadBalancer Nov 26 18:49:11.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:13.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:15.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:17.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:19.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:21.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:23.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:25.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:27.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:29.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:31.640: INFO: Retrying .... 
error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:33.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:35.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:37.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:39.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:41.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:43.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:45.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:47.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:49.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:51.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:53.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:55.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:57.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:59.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:50:01.639: INFO: Retrying .... 
error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:50:03.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:50:05.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:50:07.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:52:49.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:52:51.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:52:53.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:52:55.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:52:57.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:52:59.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:01.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:03.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:05.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:07.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:09.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:11.639: INFO: Retrying .... 
error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:13.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:15.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:17.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:19.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:21.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:23.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:25.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:27.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:29.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:31.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:33.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:35.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:37.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:39.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:41.639: INFO: Retrying .... 
error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:43.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:45.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:47.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:49.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:51.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:53.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:55.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:57.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:53:59.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:01.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:03.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:05.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #1 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should work for type=LoadBalancer (Spec Runtime: 5m0.637s) test/e2e/network/loadbalancer.go:1266 In [It] (Node Runtime: 5m0.001s) test/e2e/network/loadbalancer.go:1266 At [By Step] waiting for loadbalancer for service esipp-5381/external-local-lb (Step Runtime: 4m59.682s) test/e2e/framework/service/jig.go:260 Spec Goroutine goroutine 1325 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc0041d0c30, 0x2fdb16a?) 
vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x28?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc000182420?, 0xc004fcfb78?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0xc00147cf60?, 0x7fa7740?, 0xc000116b80?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001f0aaf0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68) test/e2e/framework/service/jig.go:631 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001f0aaf0, 0x41?) test/e2e/framework/service/jig.go:582 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateLoadBalancerService(0xc001f0aaf0, 0x6aba880?, 0xc004fcfe28) test/e2e/framework/service/jig.go:261 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateOnlyLocalLoadBalancerService(0xc001f0aaf0, 0xc002cf84e0?, 0x1, 0xa?) test/e2e/framework/service/jig.go:222 > k8s.io/kubernetes/test/e2e/network.glob..func20.3() test/e2e/network/loadbalancer.go:1271 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0038c42d0, 0xc001413880}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:54:07.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:09.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:11.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:13.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:15.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:17.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:19.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:21.639: INFO: Retrying .... 
error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:23.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:25.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #1 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should work for type=LoadBalancer (Spec Runtime: 5m20.639s) test/e2e/network/loadbalancer.go:1266 In [It] (Node Runtime: 5m20.003s) test/e2e/network/loadbalancer.go:1266 At [By Step] waiting for loadbalancer for service esipp-5381/external-local-lb (Step Runtime: 5m19.684s) test/e2e/framework/service/jig.go:260 Spec Goroutine goroutine 1325 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc0041d0c30, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x28?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc000182420?, 0xc004fcfb78?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0xc00147cf60?, 0x7fa7740?, 0xc000116b80?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001f0aaf0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68) test/e2e/framework/service/jig.go:631 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001f0aaf0, 0x41?) test/e2e/framework/service/jig.go:582 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateLoadBalancerService(0xc001f0aaf0, 0x6aba880?, 0xc004fcfe28) test/e2e/framework/service/jig.go:261 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateOnlyLocalLoadBalancerService(0xc001f0aaf0, 0xc002cf84e0?, 0x1, 0xa?) test/e2e/framework/service/jig.go:222 > k8s.io/kubernetes/test/e2e/network.glob..func20.3() test/e2e/network/loadbalancer.go:1271 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0038c42d0, 0xc001413880}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:54:27.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:29.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:31.639: INFO: Retrying .... 
error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:33.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:35.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:37.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:39.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:41.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:43.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #1 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should work for type=LoadBalancer (Spec Runtime: 5m40.642s) test/e2e/network/loadbalancer.go:1266 In [It] (Node Runtime: 5m40.006s) test/e2e/network/loadbalancer.go:1266 At [By Step] waiting for loadbalancer for service esipp-5381/external-local-lb (Step Runtime: 5m39.687s) test/e2e/framework/service/jig.go:260 Spec Goroutine goroutine 1325 [select] net/http.(*Transport).getConn(0xc003f78000, 0xc0037a3c80, {{}, 0x0, {0xc000efef00, 0x5}, {0xc001205e60, 0xf}, 0x0}) /usr/local/go/src/net/http/transport.go:1376 net/http.(*Transport).roundTrip(0xc003f78000, 0xc000464700) /usr/local/go/src/net/http/transport.go:582 net/http.(*Transport).RoundTrip(0x6fe4b20?, 0xc004000030?) /usr/local/go/src/net/http/roundtrip.go:17 k8s.io/kubernetes/vendor/k8s.io/client-go/transport.(*bearerAuthRoundTripper).RoundTrip(0xc004ec2090, 0xc000464600) vendor/k8s.io/client-go/transport/round_trippers.go:317 k8s.io/kubernetes/vendor/k8s.io/client-go/transport.(*userAgentRoundTripper).RoundTrip(0xc004eaf400, 0xc000464500) vendor/k8s.io/client-go/transport/round_trippers.go:168 net/http.send(0xc000464500, {0x7fad100, 0xc004eaf400}, {0x74d54e0?, 0x1?, 0x0?}) /usr/local/go/src/net/http/client.go:251 net/http.(*Client).send(0xc004ec20c0, 0xc000464500, {0x7fdea5790108?, 0x100?, 0x0?}) /usr/local/go/src/net/http/client.go:175 net/http.(*Client).do(0xc004ec20c0, 0xc000464500) /usr/local/go/src/net/http/client.go:715 net/http.(*Client).Do(...) /usr/local/go/src/net/http/client.go:581 k8s.io/kubernetes/vendor/k8s.io/client-go/rest.(*Request).request(0xc000464300, {0x7fe0bc8, 0xc0001b0008}, 0x0?) 
vendor/k8s.io/client-go/rest/request.go:964 k8s.io/kubernetes/vendor/k8s.io/client-go/rest.(*Request).Do(0xc000464300, {0x7fe0bc8, 0xc0001b0008}) vendor/k8s.io/client-go/rest/request.go:1005 k8s.io/kubernetes/vendor/k8s.io/client-go/kubernetes/typed/core/v1.(*services).Get(0xc0012a6a80, {0x7fe0bc8, 0xc0001b0008}, {0x75ed637, 0x11}, {{{0x0, 0x0}, {0x0, 0x0}}, {0x0, ...}}) vendor/k8s.io/client-go/kubernetes/typed/core/v1/service.go:79 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition.func1() test/e2e/framework/service/jig.go:620 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.ConditionFunc.WithContext.func1({0x2742871, 0x0}) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:222 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.runConditionWithCrashProtectionWithContext({0x7fe0bc8?, 0xc0001b0000?}, 0xc0004fe268?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:235 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc0041d0c30, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:662 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x28?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc000182420?, 0xc004fcfb78?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0xc00147cf60?, 0x7fa7740?, 0xc000116b80?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001f0aaf0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68) test/e2e/framework/service/jig.go:631 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001f0aaf0, 0x41?) test/e2e/framework/service/jig.go:582 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateLoadBalancerService(0xc001f0aaf0, 0x6aba880?, 0xc004fcfe28) test/e2e/framework/service/jig.go:261 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateOnlyLocalLoadBalancerService(0xc001f0aaf0, 0xc002cf84e0?, 0x1, 0xa?) test/e2e/framework/service/jig.go:222 > k8s.io/kubernetes/test/e2e/network.glob..func20.3() test/e2e/network/loadbalancer.go:1271 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0038c42d0, 0xc001413880}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:54:49.040: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused - error from a previous attempt: read tcp 10.60.55.171:41402->34.83.88.61:443: read: connection reset by peer Nov 26 18:54:49.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:51.639: INFO: Retrying .... 
error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:53.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:55.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:57.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:59.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:01.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:03.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:05.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #1 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should work for type=LoadBalancer (Spec Runtime: 6m0.645s) test/e2e/network/loadbalancer.go:1266 In [It] (Node Runtime: 6m0.009s) test/e2e/network/loadbalancer.go:1266 At [By Step] waiting for loadbalancer for service esipp-5381/external-local-lb (Step Runtime: 5m59.69s) test/e2e/framework/service/jig.go:260 Spec Goroutine goroutine 1325 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc0041d0c30, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x28?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc000182420?, 0xc004fcfb78?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0xc00147cf60?, 0x7fa7740?, 0xc000116b80?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001f0aaf0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68) test/e2e/framework/service/jig.go:631 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001f0aaf0, 0x41?) 
test/e2e/framework/service/jig.go:582 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateLoadBalancerService(0xc001f0aaf0, 0x6aba880?, 0xc004fcfe28) test/e2e/framework/service/jig.go:261 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateOnlyLocalLoadBalancerService(0xc001f0aaf0, 0xc002cf84e0?, 0x1, 0xa?) test/e2e/framework/service/jig.go:222 > k8s.io/kubernetes/test/e2e/network.glob..func20.3() test/e2e/network/loadbalancer.go:1271 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0038c42d0, 0xc001413880}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:55:07.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:09.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:11.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:13.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:15.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:17.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:19.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:21.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:23.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:25.640: INFO: Retrying .... 
error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #1 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should work for type=LoadBalancer (Spec Runtime: 6m20.649s) test/e2e/network/loadbalancer.go:1266 In [It] (Node Runtime: 6m20.013s) test/e2e/network/loadbalancer.go:1266 At [By Step] waiting for loadbalancer for service esipp-5381/external-local-lb (Step Runtime: 6m19.694s) test/e2e/framework/service/jig.go:260 Spec Goroutine goroutine 1325 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc0041d0c30, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x28?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc000182420?, 0xc004fcfb78?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0xc00147cf60?, 0x7fa7740?, 0xc000116b80?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001f0aaf0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68) test/e2e/framework/service/jig.go:631 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001f0aaf0, 0x41?) test/e2e/framework/service/jig.go:582 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateLoadBalancerService(0xc001f0aaf0, 0x6aba880?, 0xc004fcfe28) test/e2e/framework/service/jig.go:261 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateOnlyLocalLoadBalancerService(0xc001f0aaf0, 0xc002cf84e0?, 0x1, 0xa?) test/e2e/framework/service/jig.go:222 > k8s.io/kubernetes/test/e2e/network.glob..func20.3() test/e2e/network/loadbalancer.go:1271 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0038c42d0, 0xc001413880}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:55:27.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:29.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:31.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:33.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:35.640: INFO: Retrying .... 
error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:37.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:39.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:41.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:43.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:45.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #1 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should work for type=LoadBalancer (Spec Runtime: 6m40.655s) test/e2e/network/loadbalancer.go:1266 In [It] (Node Runtime: 6m40.019s) test/e2e/network/loadbalancer.go:1266 At [By Step] waiting for loadbalancer for service esipp-5381/external-local-lb (Step Runtime: 6m39.7s) test/e2e/framework/service/jig.go:260 Spec Goroutine goroutine 1325 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc0041d0c30, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x28?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc000182420?, 0xc004fcfb78?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0xc00147cf60?, 0x7fa7740?, 0xc000116b80?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001f0aaf0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68) test/e2e/framework/service/jig.go:631 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001f0aaf0, 0x41?) test/e2e/framework/service/jig.go:582 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateLoadBalancerService(0xc001f0aaf0, 0x6aba880?, 0xc004fcfe28) test/e2e/framework/service/jig.go:261 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateOnlyLocalLoadBalancerService(0xc001f0aaf0, 0xc002cf84e0?, 0x1, 0xa?) 
test/e2e/framework/service/jig.go:222 > k8s.io/kubernetes/test/e2e/network.glob..func20.3() test/e2e/network/loadbalancer.go:1271 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0038c42d0, 0xc001413880}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:55:47.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:49.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:51.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:53.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:55.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:57.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:55:59.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:01.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:03.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:05.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #1 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should work for type=LoadBalancer (Spec Runtime: 7m0.657s) test/e2e/network/loadbalancer.go:1266 In [It] (Node Runtime: 7m0.02s) test/e2e/network/loadbalancer.go:1266 At [By Step] waiting for loadbalancer for service esipp-5381/external-local-lb (Step Runtime: 6m59.702s) test/e2e/framework/service/jig.go:260 Spec Goroutine goroutine 1325 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc0041d0c30, 0x2fdb16a?) 
vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x28?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc000182420?, 0xc004fcfb78?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0xc00147cf60?, 0x7fa7740?, 0xc000116b80?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001f0aaf0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68) test/e2e/framework/service/jig.go:631 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001f0aaf0, 0x41?) test/e2e/framework/service/jig.go:582 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateLoadBalancerService(0xc001f0aaf0, 0x6aba880?, 0xc004fcfe28) test/e2e/framework/service/jig.go:261 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateOnlyLocalLoadBalancerService(0xc001f0aaf0, 0xc002cf84e0?, 0x1, 0xa?) test/e2e/framework/service/jig.go:222 > k8s.io/kubernetes/test/e2e/network.glob..func20.3() test/e2e/network/loadbalancer.go:1271 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0038c42d0, 0xc001413880}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:56:07.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:09.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:11.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:13.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:15.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:17.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:19.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:21.639: INFO: Retrying .... 
error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:23.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:25.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #1 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should work for type=LoadBalancer (Spec Runtime: 7m20.659s) test/e2e/network/loadbalancer.go:1266 In [It] (Node Runtime: 7m20.023s) test/e2e/network/loadbalancer.go:1266 At [By Step] waiting for loadbalancer for service esipp-5381/external-local-lb (Step Runtime: 7m19.704s) test/e2e/framework/service/jig.go:260 Spec Goroutine goroutine 1325 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc0041d0c30, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x28?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc000182420?, 0xc004fcfb78?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0xc00147cf60?, 0x7fa7740?, 0xc000116b80?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001f0aaf0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68) test/e2e/framework/service/jig.go:631 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001f0aaf0, 0x41?) test/e2e/framework/service/jig.go:582 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateLoadBalancerService(0xc001f0aaf0, 0x6aba880?, 0xc004fcfe28) test/e2e/framework/service/jig.go:261 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateOnlyLocalLoadBalancerService(0xc001f0aaf0, 0xc002cf84e0?, 0x1, 0xa?) test/e2e/framework/service/jig.go:222 > k8s.io/kubernetes/test/e2e/network.glob..func20.3() test/e2e/network/loadbalancer.go:1271 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0038c42d0, 0xc001413880}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:56:27.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:29.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:31.640: INFO: Retrying .... 
error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:33.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:35.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:37.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:39.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:41.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:43.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:45.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #1 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should work for type=LoadBalancer (Spec Runtime: 7m40.662s) test/e2e/network/loadbalancer.go:1266 In [It] (Node Runtime: 7m40.025s) test/e2e/network/loadbalancer.go:1266 At [By Step] waiting for loadbalancer for service esipp-5381/external-local-lb (Step Runtime: 7m39.707s) test/e2e/framework/service/jig.go:260 Spec Goroutine goroutine 1325 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc0041d0c30, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x28?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc000182420?, 0xc004fcfb78?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0xc00147cf60?, 0x7fa7740?, 0xc000116b80?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001f0aaf0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68) test/e2e/framework/service/jig.go:631 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001f0aaf0, 0x41?) 
test/e2e/framework/service/jig.go:582 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateLoadBalancerService(0xc001f0aaf0, 0x6aba880?, 0xc004fcfe28) test/e2e/framework/service/jig.go:261 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateOnlyLocalLoadBalancerService(0xc001f0aaf0, 0xc002cf84e0?, 0x1, 0xa?) test/e2e/framework/service/jig.go:222 > k8s.io/kubernetes/test/e2e/network.glob..func20.3() test/e2e/network/loadbalancer.go:1271 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0038c42d0, 0xc001413880}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:56:47.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:49.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:51.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:53.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:55.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:57.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:56:59.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:57:01.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:57:03.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:57:05.640: INFO: Retrying .... 
error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #1 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should work for type=LoadBalancer (Spec Runtime: 8m0.663s) test/e2e/network/loadbalancer.go:1266 In [It] (Node Runtime: 8m0.027s) test/e2e/network/loadbalancer.go:1266 At [By Step] waiting for loadbalancer for service esipp-5381/external-local-lb (Step Runtime: 7m59.709s) test/e2e/framework/service/jig.go:260 Spec Goroutine goroutine 1325 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc0041d0c30, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x28?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc000182420?, 0xc004fcfb78?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0xc00147cf60?, 0x7fa7740?, 0xc000116b80?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001f0aaf0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68) test/e2e/framework/service/jig.go:631 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001f0aaf0, 0x41?) test/e2e/framework/service/jig.go:582 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateLoadBalancerService(0xc001f0aaf0, 0x6aba880?, 0xc004fcfe28) test/e2e/framework/service/jig.go:261 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateOnlyLocalLoadBalancerService(0xc001f0aaf0, 0xc002cf84e0?, 0x1, 0xa?) test/e2e/framework/service/jig.go:222 > k8s.io/kubernetes/test/e2e/network.glob..func20.3() test/e2e/network/loadbalancer.go:1271 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0038c42d0, 0xc001413880}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:57:07.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:57:09.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:57:11.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:57:13.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:57:15.640: INFO: Retrying .... 
error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:57:17.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:57:19.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:57:21.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:57:23.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:57:25.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #1 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should work for type=LoadBalancer (Spec Runtime: 8m20.666s) test/e2e/network/loadbalancer.go:1266 In [It] (Node Runtime: 8m20.03s) test/e2e/network/loadbalancer.go:1266 At [By Step] waiting for loadbalancer for service esipp-5381/external-local-lb (Step Runtime: 8m19.711s) test/e2e/framework/service/jig.go:260 Spec Goroutine goroutine 1325 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc0041d0c30, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x28?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc000182420?, 0xc004fcfb78?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0xc00147cf60?, 0x7fa7740?, 0xc000116b80?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001f0aaf0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68) test/e2e/framework/service/jig.go:631 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001f0aaf0, 0x41?) test/e2e/framework/service/jig.go:582 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateLoadBalancerService(0xc001f0aaf0, 0x6aba880?, 0xc004fcfe28) test/e2e/framework/service/jig.go:261 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateOnlyLocalLoadBalancerService(0xc001f0aaf0, 0xc002cf84e0?, 0x1, 0xa?) 
test/e2e/framework/service/jig.go:222 > k8s.io/kubernetes/test/e2e/network.glob..func20.3() test/e2e/network/loadbalancer.go:1271 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0038c42d0, 0xc001413880}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:57:27.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:57:29.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:57:31.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:57:33.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:57:35.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #1 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should work for type=LoadBalancer (Spec Runtime: 8m40.669s) test/e2e/network/loadbalancer.go:1266 In [It] (Node Runtime: 8m40.032s) test/e2e/network/loadbalancer.go:1266 At [By Step] waiting for loadbalancer for service esipp-5381/external-local-lb (Step Runtime: 8m39.714s) test/e2e/framework/service/jig.go:260 Spec Goroutine goroutine 1325 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc0041d0c30, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x28?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc000182420?, 0xc004fcfb78?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0xc00147cf60?, 0x7fa7740?, 0xc000116b80?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001f0aaf0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68) test/e2e/framework/service/jig.go:631 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001f0aaf0, 0x41?) test/e2e/framework/service/jig.go:582 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateLoadBalancerService(0xc001f0aaf0, 0x6aba880?, 0xc004fcfe28) test/e2e/framework/service/jig.go:261 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateOnlyLocalLoadBalancerService(0xc001f0aaf0, 0xc002cf84e0?, 0x1, 0xa?) 
test/e2e/framework/service/jig.go:222 > k8s.io/kubernetes/test/e2e/network.glob..func20.3() test/e2e/network/loadbalancer.go:1271 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0038c42d0, 0xc001413880}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ ------------------------------ Progress Report for Ginkgo Process #1 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should work for type=LoadBalancer (Spec Runtime: 9m0.671s) test/e2e/network/loadbalancer.go:1266 In [It] (Node Runtime: 9m0.035s) test/e2e/network/loadbalancer.go:1266 At [By Step] waiting for loadbalancer for service esipp-5381/external-local-lb (Step Runtime: 8m59.716s) test/e2e/framework/service/jig.go:260 Spec Goroutine goroutine 1325 [select] k8s.io/kubernetes/vendor/golang.org/x/net/http2.(*ClientConn).RoundTrip(0xc000590d80, 0xc004ec6b00) vendor/golang.org/x/net/http2/transport.go:1200 k8s.io/kubernetes/vendor/golang.org/x/net/http2.(*Transport).RoundTripOpt(0xc003961e00, 0xc004ec6b00, {0xe0?}) vendor/golang.org/x/net/http2/transport.go:519 k8s.io/kubernetes/vendor/golang.org/x/net/http2.(*Transport).RoundTrip(...) vendor/golang.org/x/net/http2/transport.go:480 k8s.io/kubernetes/vendor/golang.org/x/net/http2.noDialH2RoundTripper.RoundTrip({0xc003f78000?}, 0xc004ec6b00?) vendor/golang.org/x/net/http2/transport.go:3020 net/http.(*Transport).roundTrip(0xc003f78000, 0xc004ec6b00) /usr/local/go/src/net/http/transport.go:540 net/http.(*Transport).RoundTrip(0x6fe4b20?, 0xc002bdd110?) /usr/local/go/src/net/http/roundtrip.go:17 k8s.io/kubernetes/vendor/k8s.io/client-go/transport.(*bearerAuthRoundTripper).RoundTrip(0xc004ec2090, 0xc004ec6a00) vendor/k8s.io/client-go/transport/round_trippers.go:317 k8s.io/kubernetes/vendor/k8s.io/client-go/transport.(*userAgentRoundTripper).RoundTrip(0xc004eaf400, 0xc004ec6900) vendor/k8s.io/client-go/transport/round_trippers.go:168 net/http.send(0xc004ec6900, {0x7fad100, 0xc004eaf400}, {0x74d54e0?, 0x1?, 0x0?}) /usr/local/go/src/net/http/client.go:251 net/http.(*Client).send(0xc004ec20c0, 0xc004ec6900, {0x7fdea5790a68?, 0x100?, 0x0?}) /usr/local/go/src/net/http/client.go:175 net/http.(*Client).do(0xc004ec20c0, 0xc004ec6900) /usr/local/go/src/net/http/client.go:715 net/http.(*Client).Do(...) /usr/local/go/src/net/http/client.go:581 k8s.io/kubernetes/vendor/k8s.io/client-go/rest.(*Request).request(0xc004ec6500, {0x7fe0bc8, 0xc0001b0008}, 0x0?) 
vendor/k8s.io/client-go/rest/request.go:964 k8s.io/kubernetes/vendor/k8s.io/client-go/rest.(*Request).Do(0xc004ec6500, {0x7fe0bc8, 0xc0001b0008}) vendor/k8s.io/client-go/rest/request.go:1005 k8s.io/kubernetes/vendor/k8s.io/client-go/kubernetes/typed/core/v1.(*services).Get(0xc003f12200, {0x7fe0bc8, 0xc0001b0008}, {0x75ed637, 0x11}, {{{0x0, 0x0}, {0x0, 0x0}}, {0x0, ...}}) vendor/k8s.io/client-go/kubernetes/typed/core/v1/service.go:79 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition.func1() test/e2e/framework/service/jig.go:620 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.ConditionFunc.WithContext.func1({0x2742871, 0x0}) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:222 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.runConditionWithCrashProtectionWithContext({0x7fe0bc8?, 0xc0001b0000?}, 0xc0004fe268?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:235 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc0041d0c30, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:662 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x28?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc000182420?, 0xc004fcfb78?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0xc00147cf60?, 0x7fa7740?, 0xc000116b80?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001f0aaf0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68) test/e2e/framework/service/jig.go:631 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001f0aaf0, 0x41?) test/e2e/framework/service/jig.go:582 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateLoadBalancerService(0xc001f0aaf0, 0x6aba880?, 0xc004fcfe28) test/e2e/framework/service/jig.go:261 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateOnlyLocalLoadBalancerService(0xc001f0aaf0, 0xc002cf84e0?, 0x1, 0xa?) test/e2e/framework/service/jig.go:222 > k8s.io/kubernetes/test/e2e/network.glob..func20.3() test/e2e/network/loadbalancer.go:1271 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0038c42d0, 0xc001413880}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ ------------------------------ Progress Report for Ginkgo Process #1 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should work for type=LoadBalancer (Spec Runtime: 9m20.675s) test/e2e/network/loadbalancer.go:1266 In [It] (Node Runtime: 9m20.039s) test/e2e/network/loadbalancer.go:1266 At [By Step] waiting for loadbalancer for service esipp-5381/external-local-lb (Step Runtime: 9m19.72s) test/e2e/framework/service/jig.go:260 Spec Goroutine goroutine 1325 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc0041d0c30, 0x2fdb16a?) 
vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x28?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc000182420?, 0xc004fcfb78?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0xc00147cf60?, 0x7fa7740?, 0xc000116b80?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001f0aaf0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68) test/e2e/framework/service/jig.go:631 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001f0aaf0, 0x41?) test/e2e/framework/service/jig.go:582 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateLoadBalancerService(0xc001f0aaf0, 0x6aba880?, 0xc004fcfe28) test/e2e/framework/service/jig.go:261 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateOnlyLocalLoadBalancerService(0xc001f0aaf0, 0xc002cf84e0?, 0x1, 0xa?) test/e2e/framework/service/jig.go:222 > k8s.io/kubernetes/test/e2e/network.glob..func20.3() test/e2e/network/loadbalancer.go:1271 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0038c42d0, 0xc001413880}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ ------------------------------ Progress Report for Ginkgo Process #1 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should work for type=LoadBalancer (Spec Runtime: 9m40.682s) test/e2e/network/loadbalancer.go:1266 In [It] (Node Runtime: 9m40.046s) test/e2e/network/loadbalancer.go:1266 At [By Step] waiting for loadbalancer for service esipp-5381/external-local-lb (Step Runtime: 9m39.727s) test/e2e/framework/service/jig.go:260 Spec Goroutine goroutine 1325 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc0041d0c30, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x28?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc000182420?, 0xc004fcfb78?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0xc00147cf60?, 0x7fa7740?, 0xc000116b80?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001f0aaf0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68) test/e2e/framework/service/jig.go:631 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001f0aaf0, 0x41?) 
test/e2e/framework/service/jig.go:582 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateLoadBalancerService(0xc001f0aaf0, 0x6aba880?, 0xc004fcfe28) test/e2e/framework/service/jig.go:261 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateOnlyLocalLoadBalancerService(0xc001f0aaf0, 0xc002cf84e0?, 0x1, 0xa?) test/e2e/framework/service/jig.go:222 > k8s.io/kubernetes/test/e2e/network.glob..func20.3() test/e2e/network/loadbalancer.go:1271 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0038c42d0, 0xc001413880}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ ------------------------------ Progress Report for Ginkgo Process #1 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should work for type=LoadBalancer (Spec Runtime: 10m0.685s) test/e2e/network/loadbalancer.go:1266 In [It] (Node Runtime: 10m0.049s) test/e2e/network/loadbalancer.go:1266 At [By Step] waiting for loadbalancer for service esipp-5381/external-local-lb (Step Runtime: 9m59.73s) test/e2e/framework/service/jig.go:260 Spec Goroutine goroutine 1325 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc0041d0c30, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x28?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc000182420?, 0xc004fcfb78?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0xc00147cf60?, 0x7fa7740?, 0xc000116b80?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001f0aaf0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68) test/e2e/framework/service/jig.go:631 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001f0aaf0, 0x41?) test/e2e/framework/service/jig.go:582 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateLoadBalancerService(0xc001f0aaf0, 0x6aba880?, 0xc004fcfe28) test/e2e/framework/service/jig.go:261 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateOnlyLocalLoadBalancerService(0xc001f0aaf0, 0xc002cf84e0?, 0x1, 0xa?) 
test/e2e/framework/service/jig.go:222 > k8s.io/kubernetes/test/e2e/network.glob..func20.3() test/e2e/network/loadbalancer.go:1271 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0038c42d0, 0xc001413880}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ ------------------------------ Progress Report for Ginkgo Process #1 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should work for type=LoadBalancer (Spec Runtime: 10m20.687s) test/e2e/network/loadbalancer.go:1266 In [It] (Node Runtime: 10m20.051s) test/e2e/network/loadbalancer.go:1266 At [By Step] waiting for loadbalancer for service esipp-5381/external-local-lb (Step Runtime: 10m19.732s) test/e2e/framework/service/jig.go:260 Spec Goroutine goroutine 1325 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc0041d0c30, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x28?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc000182420?, 0xc004fcfb78?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0xc00147cf60?, 0x7fa7740?, 0xc000116b80?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001f0aaf0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68) test/e2e/framework/service/jig.go:631 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001f0aaf0, 0x41?) test/e2e/framework/service/jig.go:582 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateLoadBalancerService(0xc001f0aaf0, 0x6aba880?, 0xc004fcfe28) test/e2e/framework/service/jig.go:261 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateOnlyLocalLoadBalancerService(0xc001f0aaf0, 0xc002cf84e0?, 0x1, 0xa?) 
test/e2e/framework/service/jig.go:222 > k8s.io/kubernetes/test/e2e/network.glob..func20.3() test/e2e/network/loadbalancer.go:1271 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0038c42d0, 0xc001413880}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ ------------------------------ Progress Report for Ginkgo Process #1 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should work for type=LoadBalancer (Spec Runtime: 10m40.689s) test/e2e/network/loadbalancer.go:1266 In [It] (Node Runtime: 10m40.053s) test/e2e/network/loadbalancer.go:1266 At [By Step] waiting for loadbalancer for service esipp-5381/external-local-lb (Step Runtime: 10m39.734s) test/e2e/framework/service/jig.go:260 Spec Goroutine goroutine 1325 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc0041d0c30, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x28?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc000182420?, 0xc004fcfb78?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0xc00147cf60?, 0x7fa7740?, 0xc000116b80?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001f0aaf0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68) test/e2e/framework/service/jig.go:631 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001f0aaf0, 0x41?) test/e2e/framework/service/jig.go:582 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateLoadBalancerService(0xc001f0aaf0, 0x6aba880?, 0xc004fcfe28) test/e2e/framework/service/jig.go:261 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateOnlyLocalLoadBalancerService(0xc001f0aaf0, 0xc002cf84e0?, 0x1, 0xa?) test/e2e/framework/service/jig.go:222 > k8s.io/kubernetes/test/e2e/network.glob..func20.3() test/e2e/network/loadbalancer.go:1271 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0038c42d0, 0xc001413880}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:59:53.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:59:55.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:59:57.640: INFO: Retrying .... 
error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:59:59.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:01.641: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:03.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:05.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #1 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should work for type=LoadBalancer (Spec Runtime: 11m0.692s) test/e2e/network/loadbalancer.go:1266 In [It] (Node Runtime: 11m0.056s) test/e2e/network/loadbalancer.go:1266 At [By Step] waiting for loadbalancer for service esipp-5381/external-local-lb (Step Runtime: 10m59.737s) test/e2e/framework/service/jig.go:260 Spec Goroutine goroutine 1325 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc0041d0c30, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x28?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc000182420?, 0xc004fcfb78?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0xc00147cf60?, 0x7fa7740?, 0xc000116b80?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001f0aaf0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68) test/e2e/framework/service/jig.go:631 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001f0aaf0, 0x41?) test/e2e/framework/service/jig.go:582 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateLoadBalancerService(0xc001f0aaf0, 0x6aba880?, 0xc004fcfe28) test/e2e/framework/service/jig.go:261 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateOnlyLocalLoadBalancerService(0xc001f0aaf0, 0xc002cf84e0?, 0x1, 0xa?) test/e2e/framework/service/jig.go:222 > k8s.io/kubernetes/test/e2e/network.glob..func20.3() test/e2e/network/loadbalancer.go:1271 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0038c42d0, 0xc001413880}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 19:00:07.641: INFO: Retrying .... 
error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:09.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:11.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:13.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:15.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:17.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:19.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:21.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:23.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:25.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #1 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should work for type=LoadBalancer (Spec Runtime: 11m20.694s) test/e2e/network/loadbalancer.go:1266 In [It] (Node Runtime: 11m20.058s) test/e2e/network/loadbalancer.go:1266 At [By Step] waiting for loadbalancer for service esipp-5381/external-local-lb (Step Runtime: 11m19.739s) test/e2e/framework/service/jig.go:260 Spec Goroutine goroutine 1325 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc0041d0c30, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x28?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc000182420?, 0xc004fcfb78?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0xc00147cf60?, 0x7fa7740?, 0xc000116b80?) 
vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001f0aaf0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68) test/e2e/framework/service/jig.go:631 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001f0aaf0, 0x41?) test/e2e/framework/service/jig.go:582 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateLoadBalancerService(0xc001f0aaf0, 0x6aba880?, 0xc004fcfe28) test/e2e/framework/service/jig.go:261 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateOnlyLocalLoadBalancerService(0xc001f0aaf0, 0xc002cf84e0?, 0x1, 0xa?) test/e2e/framework/service/jig.go:222 > k8s.io/kubernetes/test/e2e/network.glob..func20.3() test/e2e/network/loadbalancer.go:1271 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0038c42d0, 0xc001413880}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 19:00:27.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:29.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:31.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:33.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:35.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:37.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:39.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:41.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:43.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:45.640: INFO: Retrying .... 
error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #1 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should work for type=LoadBalancer (Spec Runtime: 11m40.698s) test/e2e/network/loadbalancer.go:1266 In [It] (Node Runtime: 11m40.061s) test/e2e/network/loadbalancer.go:1266 At [By Step] waiting for loadbalancer for service esipp-5381/external-local-lb (Step Runtime: 11m39.743s) test/e2e/framework/service/jig.go:260 Spec Goroutine goroutine 1325 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc0041d0c30, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x28?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc000182420?, 0xc004fcfb78?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0xc00147cf60?, 0x7fa7740?, 0xc000116b80?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001f0aaf0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68) test/e2e/framework/service/jig.go:631 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001f0aaf0, 0x41?) test/e2e/framework/service/jig.go:582 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateLoadBalancerService(0xc001f0aaf0, 0x6aba880?, 0xc004fcfe28) test/e2e/framework/service/jig.go:261 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateOnlyLocalLoadBalancerService(0xc001f0aaf0, 0xc002cf84e0?, 0x1, 0xa?) test/e2e/framework/service/jig.go:222 > k8s.io/kubernetes/test/e2e/network.glob..func20.3() test/e2e/network/loadbalancer.go:1271 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0038c42d0, 0xc001413880}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 19:00:47.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:49.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:51.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:53.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:55.640: INFO: Retrying .... 
error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:57.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:00:59.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:01:01.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:01:03.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 19:01:05.639: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #1 Automatically polling progress: [sig-network] LoadBalancers ESIPP [Slow] should work for type=LoadBalancer (Spec Runtime: 12m0.701s) test/e2e/network/loadbalancer.go:1266 In [It] (Node Runtime: 12m0.065s) test/e2e/network/loadbalancer.go:1266 At [By Step] waiting for loadbalancer for service esipp-5381/external-local-lb (Step Runtime: 11m59.746s) test/e2e/framework/service/jig.go:260 Spec Goroutine goroutine 1325 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc0041d0c30, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x28?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc000182420?, 0xc004fcfb78?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0xc00147cf60?, 0x7fa7740?, 0xc000116b80?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001f0aaf0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68) test/e2e/framework/service/jig.go:631 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001f0aaf0, 0x41?) test/e2e/framework/service/jig.go:582 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateLoadBalancerService(0xc001f0aaf0, 0x6aba880?, 0xc004fcfe28) test/e2e/framework/service/jig.go:261 > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateOnlyLocalLoadBalancerService(0xc001f0aaf0, 0xc002cf84e0?, 0x1, 0xa?) 
test/e2e/framework/service/jig.go:222
> k8s.io/kubernetes/test/e2e/network.glob..func20.3()
    test/e2e/network/loadbalancer.go:1271
k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0038c42d0, 0xc001413880})
    vendor/github.com/onsi/ginkgo/v2/internal/node.go:449
k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2()
    vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750
k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode
    vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738
------------------------------
Nov 26 19:01:07.640: INFO: Retrying .... error trying to get Service external-local-lb: Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/services/external-local-lb": dial tcp 34.83.88.61:443: connect: connection refused
[... the same "Retrying .... connection refused" entry repeats every ~2 seconds from 19:01:07.640 through 19:04:07.679 ...]
------------------------------
Progress Report for Ginkgo Process #1
Automatically polling progress:
  [sig-network] LoadBalancers ESIPP [Slow] should work for type=LoadBalancer (Spec Runtime: 12m20.703s)
    test/e2e/network/loadbalancer.go:1266
    In [It] (Node Runtime: 12m20.067s)
      test/e2e/network/loadbalancer.go:1266
      At [By Step] waiting for loadbalancer for service esipp-5381/external-local-lb (Step Runtime: 12m19.749s)
        test/e2e/framework/service/jig.go:260

  Spec Goroutine
  goroutine 1325 [select]
  k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc0041d0c30, 0x2fdb16a?)
    vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660
  k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0001b0000}, 0x28?, 0x2fd9d05?, 0x20?)
    vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596
  k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0001b0000}, 0xc000182420?, 0xc004fcfb78?, 0x262a967?)
    vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528
  k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0xc00147cf60?, 0x7fa7740?, 0xc000116b80?)
    vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514
  > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001f0aaf0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68)
    test/e2e/framework/service/jig.go:631
  > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001f0aaf0, 0x41?)
    test/e2e/framework/service/jig.go:582
  > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateLoadBalancerService(0xc001f0aaf0, 0x6aba880?, 0xc004fcfe28)
    test/e2e/framework/service/jig.go:261
  > k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).CreateOnlyLocalLoadBalancerService(0xc001f0aaf0, 0xc002cf84e0?, 0x1, 0xa?)
    test/e2e/framework/service/jig.go:222
  > k8s.io/kubernetes/test/e2e/network.glob..func20.3()
    test/e2e/network/loadbalancer.go:1271
  k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0xc0038c42d0, 0xc001413880})
    vendor/github.com/onsi/ginkgo/v2/internal/node.go:449
  k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2()
    vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750
  k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode
    vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738
------------------------------
[... an identical progress report, with the same goroutine 1325 stack, is emitted every 20 seconds while the wait continues; only the Spec Runtime advances, from 12m20.703s up to 15m0.721s ...]
Nov 26 19:04:07.679: INFO: Unexpected error:
<*fmt.wrapError | 0xc0040510a0>: {
    msg: "timed out waiting for service \"external-local-lb\" to have a load balancer: timed out waiting for the condition",
    err: <*errors.errorString | 0xc000113c60>{
        s: "timed out waiting for the condition",
    },
}
Nov 26 19:04:07.679: FAIL: timed out waiting for service "external-local-lb" to have a load balancer: timed out waiting for the condition

Full Stack Trace
k8s.io/kubernetes/test/e2e/network.glob..func20.3()
    test/e2e/network/loadbalancer.go:1272 +0xd8
[AfterEach] [sig-network] LoadBalancers ESIPP [Slow]
  test/e2e/framework/node/init/init.go:32
Nov 26 19:04:07.680: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready
[AfterEach] [sig-network] LoadBalancers ESIPP [Slow]
  test/e2e/network/loadbalancer.go:1260
Nov 26 19:04:07.719: INFO: Output of kubectl describe svc:
Nov 26 19:04:07.719: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=esipp-5381 describe svc --namespace=esipp-5381'
Nov 26 19:04:07.831: INFO: rc: 1
Nov 26 19:04:07.831: INFO:
[DeferCleanup (Each)] [sig-network] LoadBalancers ESIPP [Slow]
  test/e2e/framework/metrics/init/init.go:33
[DeferCleanup (Each)] [sig-network] LoadBalancers ESIPP [Slow]
  dump namespaces | framework.go:196
STEP: dump namespace information after failure 11/26/22 19:04:07.832
STEP: Collecting events from namespace "esipp-5381". 11/26/22 19:04:07.832
Nov 26 19:04:07.872: INFO: Unexpected error: failed to list events in namespace "esipp-5381":
<*url.Error | 0xc002bdd7d0>: {
    Op: "Get",
    URL: "https://34.83.88.61/api/v1/namespaces/esipp-5381/events",
    Err: <*net.OpError | 0xc0041993b0>{
        Op: "dial",
        Net: "tcp",
        Source: nil,
        Addr: <*net.TCPAddr | 0xc0037cde30>{
            IP: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 255, 255, 34, 83, 88, 61],
            Port: 443,
            Zone: "",
        },
        Err: <*os.SyscallError | 0xc003f13a60>{
            Syscall: "connect",
            Err: <syscall.Errno>0x6f,
        },
    },
}
Nov 26 19:04:07.872: FAIL: failed to list events in namespace "esipp-5381": Get "https://34.83.88.61/api/v1/namespaces/esipp-5381/events": dial tcp 34.83.88.61:443: connect: connection refused

Full Stack Trace
k8s.io/kubernetes/test/e2e/framework/debug.dumpEventsInNamespace(0xc0010245c0, {0xc000df9250, 0xa})
    test/e2e/framework/debug/dump.go:44 +0x191
k8s.io/kubernetes/test/e2e/framework/debug.DumpAllNamespaceInfo({0x801de88, 0xc002cf84e0}, {0xc000df9250, 0xa})
    test/e2e/framework/debug/dump.go:62 +0x8d
k8s.io/kubernetes/test/e2e/framework/debug/init.init.0.func1.1(0xc001024650?, {0xc000df9250?, 0x7fa7740?})
    test/e2e/framework/debug/init/init.go:34 +0x32
k8s.io/kubernetes/test/e2e/framework.(*Framework).dumpNamespaceInfo.func1()
    test/e2e/framework/framework.go:274 +0x6d
k8s.io/kubernetes/test/e2e/framework.(*Framework).dumpNamespaceInfo(0xc0011c4000)
    test/e2e/framework/framework.go:271 +0x179
reflect.Value.call({0x6627cc0?, 0xc00143a2d0?, 0x0?}, {0x75b6e72, 0x4}, {0xae73300, 0x0, 0x0?})
    /usr/local/go/src/reflect/value.go:584 +0x8c5
reflect.Value.Call({0x6627cc0?, 0xc00143a2d0?, 0x0?}, {0xae73300?, 0x0?, 0x0?})
    /usr/local/go/src/reflect/value.go:368 +0xbc
[DeferCleanup (Each)] [sig-network] LoadBalancers ESIPP [Slow]
  tear down framework | framework.go:193
STEP: Destroying namespace "esipp-5381" for this suite. 11/26/22 19:04:07.872
Nov 26 19:04:07.913: FAIL: Couldn't delete ns: "esipp-5381": Delete "https://34.83.88.61/api/v1/namespaces/esipp-5381": dial tcp 34.83.88.61:443: connect: connection refused (&url.Error{Op:"Delete", URL:"https://34.83.88.61/api/v1/namespaces/esipp-5381", Err:(*net.OpError)(0xc001f0a370)})

Full Stack Trace
k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach.func1()
    test/e2e/framework/framework.go:370 +0x4fe
k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc0011c4000)
    test/e2e/framework/framework.go:383 +0x1ca
reflect.Value.call({0x6627cc0?, 0xc00143a1d0?, 0xc0000f4fb0?}, {0x75b6e72, 0x4}, {0xae73300, 0x0, 0x0?})
    /usr/local/go/src/reflect/value.go:584 +0x8c5
reflect.Value.Call({0x6627cc0?, 0xc00143a1d0?, 0x0?}, {0xae73300?, 0x5?, 0xc001173360?})
    /usr/local/go/src/reflect/value.go:368 +0xbc
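The Spec Goroutine captured in the progress reports above is parked inside wait.PollImmediate underneath (*TestJig).WaitForLoadBalancer, retrying a Service GET that is refused for the whole 15-minute budget. Below is a minimal sketch of that kind of wait loop, assuming a plain client-go clientset; the function name, kubeconfig path handling, and the 2s/15m interval and timeout are placeholders, and this is not the framework's actual jig code:

```go
package main

import (
	"context"
	"fmt"
	"time"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/util/wait"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

// waitForLoadBalancerIngress polls the Service until its LoadBalancer status
// reports at least one ingress IP/hostname, or the timeout expires.
func waitForLoadBalancerIngress(cs kubernetes.Interface, ns, name string, timeout time.Duration) error {
	return wait.PollImmediate(2*time.Second, timeout, func() (bool, error) {
		svc, err := cs.CoreV1().Services(ns).Get(context.TODO(), name, metav1.GetOptions{})
		if err != nil {
			// Transient API errors (e.g. "connection refused" while the
			// apiserver is down) are swallowed so the poll keeps retrying,
			// which matches the "Retrying ...." pattern in the log above.
			fmt.Printf("Retrying .... error trying to get Service %s: %v\n", name, err)
			return false, nil
		}
		return len(svc.Status.LoadBalancer.Ingress) > 0, nil
	})
}

func main() {
	// Hypothetical kubeconfig path; namespace and service name taken from the log.
	config, err := clientcmd.BuildConfigFromFlags("", "/workspace/.kube/config")
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(config)
	if err := waitForLoadBalancerIngress(cs, "esipp-5381", "external-local-lb", 15*time.Minute); err != nil {
		fmt.Println("FAIL:", err) // e.g. "timed out waiting for the condition"
	}
}
```

When the apiserver never becomes reachable again, the poll gives up with "timed out waiting for the condition", which is the error wrapped into the FAIL message recorded at 19:04:07.679 above.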
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-network\]\sLoadBalancers\sESIPP\s\[Slow\]\sshould\swork\sfor\stype\=NodePort$'
test/e2e/framework/framework.go:241
k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000c06000)
    test/e2e/framework/framework.go:241 +0x96f

There were additional failures detected after the initial failure:
[PANICKED] Test Panicked
In [AfterEach] at: /usr/local/go/src/runtime/panic.go:260
runtime error: invalid memory address or nil pointer dereference
Full Stack Trace
k8s.io/kubernetes/test/e2e/network.glob..func20.2()
    test/e2e/network/loadbalancer.go:1262 +0x113

from junit_01.xml
[BeforeEach] [sig-network] LoadBalancers ESIPP [Slow]
  set up framework | framework.go:178
STEP: Creating a kubernetes client 11/26/22 18:55:56.162
Nov 26 18:55:56.162: INFO: >>> kubeConfig: /workspace/.kube/config
STEP: Building a namespace api object, basename esipp 11/26/22 18:55:56.164
Nov 26 18:55:56.203: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused
[... the same "Unexpected error while creating namespace ... connection refused" entry repeats every ~2 seconds through 18:56:26.283 ...]
Nov 26 18:56:26.283: INFO: Unexpected error:
<*errors.errorString | 0xc0001fda10>: {
    s: "timed out waiting for the condition",
}
Nov 26 18:56:26.283: FAIL: timed out waiting for the condition

Full Stack Trace
k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000c06000)
    test/e2e/framework/framework.go:241 +0x96f
[AfterEach] [sig-network] LoadBalancers ESIPP [Slow]
  test/e2e/framework/node/init/init.go:32
Nov 26 18:56:26.283: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready
[AfterEach] [sig-network] LoadBalancers ESIPP [Slow]
  test/e2e/network/loadbalancer.go:1260
[DeferCleanup (Each)] [sig-network] LoadBalancers ESIPP [Slow]
  dump namespaces | framework.go:196
STEP: dump namespace information after failure 11/26/22 18:56:26.323
[DeferCleanup (Each)] [sig-network] LoadBalancers ESIPP [Slow]
  tear down framework | framework.go:193
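The [PANICKED] follow-up in the summary above comes from the test's own AfterEach at loadbalancer.go:1262 running even though BeforeEach never managed to create a namespace (every Post to the apiserver was refused), so the cleanup presumably dereferences state that was never initialized. The sketch below illustrates that failure mode and a defensive guard; lbService, the spec text, and the cleanup body are hypothetical, not the actual code at loadbalancer.go:1262:

```go
package network_test

import (
	"testing"

	"github.com/onsi/ginkgo/v2"
	"github.com/onsi/gomega"
	v1 "k8s.io/api/core/v1"
)

// Hypothetical shared state: BeforeEach is supposed to populate it and
// AfterEach cleans it up, mirroring the pattern the stack trace suggests.
var lbService *v1.Service

var _ = ginkgo.Describe("LoadBalancers ESIPP [Slow] (sketch)", func() {
	ginkgo.BeforeEach(func() {
		// In the failing runs this never succeeds: namespace creation is
		// refused, the framework fails the spec here, and lbService stays nil.
	})

	ginkgo.AfterEach(func() {
		// Without this guard the cleanup dereferences a nil pointer and the
		// spec is reported as [PANICKED] on top of the original failure.
		if lbService == nil {
			return
		}
		_ = lbService.Name // hypothetical cleanup using the service
	})

	ginkgo.It("should work for type=NodePort", func() {})
})

func TestSketch(t *testing.T) {
	gomega.RegisterFailHandler(ginkgo.Fail)
	ginkgo.RunSpecs(t, "ESIPP cleanup sketch")
}
```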
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-network\]\sLoadBalancers\sESIPP\s\[Slow\]\sshould\swork\sfrom\spods$'
test/e2e/framework/framework.go:241
k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000d42000)
    test/e2e/framework/framework.go:241 +0x96f

There were additional failures detected after the initial failure:
[PANICKED] Test Panicked
In [AfterEach] at: /usr/local/go/src/runtime/panic.go:260
runtime error: invalid memory address or nil pointer dereference
Full Stack Trace
k8s.io/kubernetes/test/e2e/network.glob..func20.2()
    test/e2e/network/loadbalancer.go:1262 +0x113

from junit_01.xml
[BeforeEach] [sig-network] LoadBalancers ESIPP [Slow]
  set up framework | framework.go:178
STEP: Creating a kubernetes client 11/26/22 18:52:49.413
Nov 26 18:52:49.413: INFO: >>> kubeConfig: /workspace/.kube/config
STEP: Building a namespace api object, basename esipp 11/26/22 18:52:49.415
Nov 26 18:52:49.454: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused
[... the same "Unexpected error while creating namespace ... connection refused" entry repeats every ~2 seconds through 18:53:19.533 ...]
Nov 26 18:53:19.533: INFO: Unexpected error:
<*errors.errorString | 0xc000115ca0>: {
    s: "timed out waiting for the condition",
}
Nov 26 18:53:19.533: FAIL: timed out waiting for the condition

Full Stack Trace
k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000d42000)
    test/e2e/framework/framework.go:241 +0x96f
[AfterEach] [sig-network] LoadBalancers ESIPP [Slow]
  test/e2e/framework/node/init/init.go:32
Nov 26 18:53:19.534: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready
[AfterEach] [sig-network] LoadBalancers ESIPP [Slow]
  test/e2e/network/loadbalancer.go:1260
[DeferCleanup (Each)] [sig-network] LoadBalancers ESIPP [Slow]
  dump namespaces | framework.go:196
STEP: dump namespace information after failure 11/26/22 18:53:19.574
[DeferCleanup (Each)] [sig-network] LoadBalancers ESIPP [Slow]
  tear down framework | framework.go:193
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-network\]\sLoadBalancers\sshould\sbe\sable\sto\schange\sthe\stype\sand\sports\sof\sa\sTCP\sservice\s\[Slow\]$'
test/e2e/framework/service/util.go:48
k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTPWithRetriableErrorCodes({0xc0037b6680, 0xe}, 0x7aae, {0xae73300, 0x0, 0x0}, 0x1?)
    test/e2e/framework/service/util.go:48 +0x265
k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTP(...)
    test/e2e/framework/service/util.go:29
k8s.io/kubernetes/test/e2e/network.glob..func19.3()
    test/e2e/network/loadbalancer.go:120 +0x465

There were additional failures detected after the initial failure:
[FAILED] Nov 26 18:56:25.104: failed to list events in namespace "loadbalancers-5282": Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-5282/events": dial tcp 34.83.88.61:443: connect: connection refused
In [DeferCleanup (Each)] at: test/e2e/framework/debug/dump.go:44
----------
[FAILED] Nov 26 18:56:25.144: Couldn't delete ns: "loadbalancers-5282": Delete "https://34.83.88.61/api/v1/namespaces/loadbalancers-5282": dial tcp 34.83.88.61:443: connect: connection refused (&url.Error{Op:"Delete", URL:"https://34.83.88.61/api/v1/namespaces/loadbalancers-5282", Err:(*net.OpError)(0xc002829090)})
In [DeferCleanup (Each)] at: test/e2e/framework/framework.go:370

from junit_01.xml
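The primary failure for this spec is TestReachableHTTP giving up on the service's NodePort; the detailed log below shows the poke loop hitting "connection refused" every two seconds. Here is a rough sketch of that kind of reachability probe, assuming a plain net/http client and wait.PollImmediate; the helper names, interval, and timeout are placeholders, not the framework's actual TestReachableHTTPWithRetriableErrorCodes:

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"time"

	"k8s.io/apimachinery/pkg/util/wait"
)

// pokeHTTP performs one probe and reports whether the endpoint answered
// with a 2xx status. Connection errors are returned so the caller can
// decide whether to keep retrying.
func pokeHTTP(url string) (bool, error) {
	client := &http.Client{Timeout: 5 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return false, err // e.g. "dial tcp ...: connect: connection refused"
	}
	defer resp.Body.Close()
	_, _ = io.Copy(io.Discard, resp.Body)
	return resp.StatusCode >= 200 && resp.StatusCode < 300, nil
}

// waitForReachableHTTP retries the poke until it succeeds or times out,
// mirroring the 2-second cadence visible in the log below.
func waitForReachableHTTP(url string, timeout time.Duration) error {
	return wait.PollImmediate(2*time.Second, timeout, func() (bool, error) {
		fmt.Printf("Poking %q\n", url)
		ok, err := pokeHTTP(url)
		if err != nil {
			fmt.Printf("Poke(%q): %v\n", url, err)
			return false, nil // treat dial errors as retriable
		}
		return ok, nil
	})
}

func main() {
	// Node IP and NodePort taken from the log; the 5-minute budget is a placeholder.
	_ = waitForReachableHTTP("http://34.168.240.199:31406/echo?msg=hello", 5*time.Minute)
}
```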
[BeforeEach] [sig-network] LoadBalancers
  set up framework | framework.go:178
STEP: Creating a kubernetes client 11/26/22 18:49:43.365
Nov 26 18:49:43.365: INFO: >>> kubeConfig: /workspace/.kube/config
STEP: Building a namespace api object, basename loadbalancers 11/26/22 18:49:43.367
Nov 26 18:49:43.406: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused
[... the same "Unexpected error while creating namespace ... connection refused" entry repeats every ~2 seconds through 18:50:07.446 ...]
STEP: Waiting for a default service account to be provisioned in namespace 11/26/22 18:51:17.589
STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 11/26/22 18:51:17.67
[BeforeEach] [sig-network] LoadBalancers
  test/e2e/framework/metrics/init/init.go:31
[BeforeEach] [sig-network] LoadBalancers
  test/e2e/network/loadbalancer.go:65
[It] should be able to change the type and ports of a TCP service [Slow]
  test/e2e/network/loadbalancer.go:77
Nov 26 18:51:20.354: INFO: namespace for TCP test: loadbalancers-5282
STEP: creating a TCP service mutability-test with type=ClusterIP in namespace loadbalancers-5282 11/26/22 18:51:20.399
Nov 26 18:51:20.449: INFO: service port TCP: 80
STEP: creating a pod to be part of the TCP service mutability-test 11/26/22 18:51:20.449
Nov 26 18:51:20.494: INFO: Waiting up to 2m0s for 1 pods to be created
Nov 26 18:51:20.537: INFO: Found all 1 pods
Nov 26 18:51:20.537: INFO: Waiting up to 2m0s for 1 pods to be running and ready: [mutability-test-5bk8c]
Nov 26 18:51:20.537: INFO: Waiting up to 2m0s for pod "mutability-test-5bk8c" in namespace "loadbalancers-5282" to be "running and ready"
Nov 26 18:51:20.578: INFO: Pod "mutability-test-5bk8c": Phase="Pending", Reason="", readiness=false. Elapsed: 41.177465ms
Nov 26 18:51:20.578: INFO: Error evaluating pod condition running and ready: want pod 'mutability-test-5bk8c' on 'bootstrap-e2e-minion-group-gnb8' to be 'Running' but was 'Pending'
Nov 26 18:51:22.631: INFO: Pod "mutability-test-5bk8c": Phase="Pending", Reason="", readiness=false. Elapsed: 2.094354862s
Nov 26 18:51:22.631: INFO: Error evaluating pod condition running and ready: want pod 'mutability-test-5bk8c' on 'bootstrap-e2e-minion-group-gnb8' to be 'Running' but was 'Pending'
Nov 26 18:51:24.634: INFO: Pod "mutability-test-5bk8c": Phase="Running", Reason="", readiness=true. Elapsed: 4.096880224s
Nov 26 18:51:24.634: INFO: Pod "mutability-test-5bk8c" satisfied condition "running and ready"
Nov 26 18:51:24.634: INFO: Wanted all 1 pods to be running and ready. Result: true. Pods: [mutability-test-5bk8c]
STEP: changing the TCP service to type=NodePort 11/26/22 18:51:24.634
Nov 26 18:51:24.763: INFO: TCP node port: 31406
STEP: hitting the TCP service's NodePort 11/26/22 18:51:24.763
Nov 26 18:51:24.763: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello"
Nov 26 18:51:24.802: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused
[... the same Poking/Poke pair repeats every ~2 seconds, each failing with "connection refused", through 18:52:34 ...]
Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:52:36.804: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:52:36.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:52:38.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:52:38.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:52:40.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:52:40.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:52:42.804: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:52:42.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:52:44.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:52:44.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:52:46.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:52:46.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:52:48.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:52:48.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:52:50.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:52:50.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:52:52.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:52:52.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:52:54.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:52:54.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:52:56.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:52:56.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:52:58.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:52:58.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:00.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:00.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get 
"http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:02.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:02.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:04.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:04.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:06.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:06.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:08.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:08.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:10.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:10.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:12.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:12.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:14.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:14.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:16.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:16.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:18.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:18.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:20.804: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:20.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:22.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:22.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:24.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:24.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:26.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:26.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: 
connect: connection refused Nov 26 18:53:28.804: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:28.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:30.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:30.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:32.804: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:32.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:34.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:34.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:36.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:36.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:38.804: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:38.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:40.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:40.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:42.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:42.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:44.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:44.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:46.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:46.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:48.804: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:48.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:50.804: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:50.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:52.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:52.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:54.804: INFO: Poking 
"http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:54.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:56.804: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:56.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:53:58.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:53:58.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:00.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:00.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:02.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:02.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:04.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:04.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:06.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:06.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:08.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:08.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:10.804: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:10.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:12.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:12.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:14.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:14.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:16.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:16.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:18.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:18.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:20.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:20.843: INFO: 
Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:22.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:22.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:24.804: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:24.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:26.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:26.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:28.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:28.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:30.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:30.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:32.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:32.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:34.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:34.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:36.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:36.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:38.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:38.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:40.804: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:40.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:42.804: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:42.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:44.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:44.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:46.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:46.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get 
"http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:48.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:48.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:50.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:50.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:52.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:52.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:54.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:54.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:56.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:56.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:54:58.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:54:58.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:00.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:00.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:02.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:02.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:04.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:04.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:06.804: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:06.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:08.804: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:08.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:10.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:10.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:12.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:12.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: 
connect: connection refused Nov 26 18:55:14.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:14.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:16.804: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:16.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:18.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:18.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:20.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:20.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:22.804: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:22.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:24.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:24.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:26.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:26.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:28.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:28.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:30.804: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:30.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:32.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:32.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:34.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:34.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:36.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:36.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:38.804: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:38.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:40.803: INFO: Poking 
"http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:40.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:42.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:42.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:44.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:44.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:46.804: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:46.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:48.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:48.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:50.804: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:50.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:52.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:52.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:54.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:54.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:56.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:56.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:55:58.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:55:58.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:56:00.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:56:00.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:56:02.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:56:02.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:56:04.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:56:04.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:56:06.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:56:06.842: INFO: 
Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:56:08.804: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:56:08.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:56:10.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:56:10.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:56:12.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:56:12.843: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:56:14.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:56:14.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:56:16.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:56:16.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:56:18.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:56:18.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #11 Automatically polling progress: [sig-network] LoadBalancers should be able to change the type and ports of a TCP service [Slow] (Spec Runtime: 6m36.943s) test/e2e/network/loadbalancer.go:77 In [It] (Node Runtime: 5m0s) test/e2e/network/loadbalancer.go:77 At [By Step] hitting the TCP service's NodePort (Step Runtime: 4m55.545s) test/e2e/network/loadbalancer.go:119 Spec Goroutine goroutine 1740 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc00013c000}, 0xc00372e678, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc00013c000}, 0xd0?, 0x2fd9d05?, 0x28?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc00013c000}, 0x2d?, 0xc00433bc20?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x754e980?, 0xc0011badf8?, 0x766a5c9?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTPWithRetriableErrorCodes({0xc0037b6680, 0xe}, 0x7aae, {0xae73300, 0x0, 0x0}, 0x1?) test/e2e/framework/service/util.go:46 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTP(...) 
test/e2e/framework/service/util.go:29 > k8s.io/kubernetes/test/e2e/network.glob..func19.3() test/e2e/network/loadbalancer.go:120 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0x2623318, 0x39e3d00}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:56:20.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:56:20.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:56:22.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:56:22.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:56:24.803: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:56:24.842: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:56:24.842: INFO: Poking "http://34.168.240.199:31406/echo?msg=hello" Nov 26 18:56:24.882: INFO: Poke("http://34.168.240.199:31406/echo?msg=hello"): Get "http://34.168.240.199:31406/echo?msg=hello": dial tcp 34.168.240.199:31406: connect: connection refused Nov 26 18:56:24.882: FAIL: Could not reach HTTP service through 34.168.240.199:31406 after 5m0s Full Stack Trace k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTPWithRetriableErrorCodes({0xc0037b6680, 0xe}, 0x7aae, {0xae73300, 0x0, 0x0}, 0x1?) test/e2e/framework/service/util.go:48 +0x265 k8s.io/kubernetes/test/e2e/framework/service.TestReachableHTTP(...) test/e2e/framework/service/util.go:29 k8s.io/kubernetes/test/e2e/network.glob..func19.3() test/e2e/network/loadbalancer.go:120 +0x465 [AfterEach] [sig-network] LoadBalancers test/e2e/framework/node/init/init.go:32 Nov 26 18:56:24.882: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready [AfterEach] [sig-network] LoadBalancers test/e2e/network/loadbalancer.go:71 Nov 26 18:56:24.922: INFO: Output of kubectl describe svc: Nov 26 18:56:24.922: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=loadbalancers-5282 describe svc --namespace=loadbalancers-5282' Nov 26 18:56:25.063: INFO: rc: 1 Nov 26 18:56:25.063: INFO: [DeferCleanup (Each)] [sig-network] LoadBalancers test/e2e/framework/metrics/init/init.go:33 [DeferCleanup (Each)] [sig-network] LoadBalancers dump namespaces | framework.go:196 STEP: dump namespace information after failure 11/26/22 18:56:25.064 STEP: Collecting events from namespace "loadbalancers-5282". 
11/26/22 18:56:25.064 Nov 26 18:56:25.104: INFO: Unexpected error: failed to list events in namespace "loadbalancers-5282": <*url.Error | 0xc004e3e9c0>: { Op: "Get", URL: "https://34.83.88.61/api/v1/namespaces/loadbalancers-5282/events", Err: <*net.OpError | 0xc002828d70>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc00467acc0>{ IP: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 255, 255, 34, 83, 88, 61], Port: 443, Zone: "", }, Err: <*os.SyscallError | 0xc0011caca0>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, } Nov 26 18:56:25.104: FAIL: failed to list events in namespace "loadbalancers-5282": Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-5282/events": dial tcp 34.83.88.61:443: connect: connection refused Full Stack Trace k8s.io/kubernetes/test/e2e/framework/debug.dumpEventsInNamespace(0xc00053e5c0, {0xc00372e540, 0x12}) test/e2e/framework/debug/dump.go:44 +0x191 k8s.io/kubernetes/test/e2e/framework/debug.DumpAllNamespaceInfo({0x801de88, 0xc00285fba0}, {0xc00372e540, 0x12}) test/e2e/framework/debug/dump.go:62 +0x8d k8s.io/kubernetes/test/e2e/framework/debug/init.init.0.func1.1(0xc00053e650?, {0xc00372e540?, 0x7fa7740?}) test/e2e/framework/debug/init/init.go:34 +0x32 k8s.io/kubernetes/test/e2e/framework.(*Framework).dumpNamespaceInfo.func1() test/e2e/framework/framework.go:274 +0x6d k8s.io/kubernetes/test/e2e/framework.(*Framework).dumpNamespaceInfo(0xc0011ea4b0) test/e2e/framework/framework.go:271 +0x179 reflect.Value.call({0x6627cc0?, 0xc00003fce0?, 0xf?}, {0x75b6e72, 0x4}, {0xae73300, 0x0, 0x5?}) /usr/local/go/src/reflect/value.go:584 +0x8c5 reflect.Value.Call({0x6627cc0?, 0xc00003fce0?, 0xc002f89530?}, {0xae73300?, 0xc003d9bf68?, 0x2a6d786?}) /usr/local/go/src/reflect/value.go:368 +0xbc [DeferCleanup (Each)] [sig-network] LoadBalancers tear down framework | framework.go:193 STEP: Destroying namespace "loadbalancers-5282" for this suite. 11/26/22 18:56:25.104 Nov 26 18:56:25.144: FAIL: Couldn't delete ns: "loadbalancers-5282": Delete "https://34.83.88.61/api/v1/namespaces/loadbalancers-5282": dial tcp 34.83.88.61:443: connect: connection refused (&url.Error{Op:"Delete", URL:"https://34.83.88.61/api/v1/namespaces/loadbalancers-5282", Err:(*net.OpError)(0xc002829090)}) Full Stack Trace k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach.func1() test/e2e/framework/framework.go:370 +0x4fe k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc0011ea4b0) test/e2e/framework/framework.go:383 +0x1ca reflect.Value.call({0x6627cc0?, 0xc00003fc00?, 0xc000a11fb0?}, {0x75b6e72, 0x4}, {0xae73300, 0x0, 0x0?}) /usr/local/go/src/reflect/value.go:584 +0x8c5 reflect.Value.Call({0x6627cc0?, 0xc00003fc00?, 0x0?}, {0xae73300?, 0x4?, 0xc003fcb3f8?}) /usr/local/go/src/reflect/value.go:368 +0xbc
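Note on the failure above: the five minutes of Poking "http://34.168.240.199:31406/echo?msg=hello" / "connection refused" pairs come from the reachability helper named in the stack trace (TestReachableHTTPWithRetriableErrorCodes, test/e2e/framework/service/util.go:46), which wraps an HTTP GET in wait.PollImmediate until the NodePort answers or the 5m budget runs out. The sketch below only illustrates that polling pattern; it is not the framework implementation, and the host, port, and message are simply the values seen in this log.

```go
// Minimal sketch of the "poke until reachable" pattern shown in the log:
// GET the NodePort URL every ~2s for up to 5m, treating dial errors as
// retriable. Illustrative only; not the e2e framework's TestReachableHTTP.
package main

import (
	"fmt"
	"io"
	"net/http"
	"time"

	"k8s.io/apimachinery/pkg/util/wait"
)

func pokeUntilReachable(host string, port int, msg string) error {
	url := fmt.Sprintf("http://%s:%d/echo?msg=%s", host, port, msg)
	client := &http.Client{Timeout: 2 * time.Second}

	return wait.PollImmediate(2*time.Second, 5*time.Minute, func() (bool, error) {
		resp, err := client.Get(url)
		if err != nil {
			// e.g. "dial tcp ...: connect: connection refused" -- keep retrying.
			fmt.Printf("Poke(%q): %v\n", url, err)
			return false, nil
		}
		defer resp.Body.Close()
		body, _ := io.ReadAll(resp.Body)
		if resp.StatusCode != http.StatusOK {
			fmt.Printf("Poke(%q): unexpected status %d\n", url, resp.StatusCode)
			return false, nil
		}
		fmt.Printf("Poke(%q): %s\n", url, string(body))
		return true, nil
	})
}

func main() {
	// Host, port, and message taken from the log above.
	if err := pokeUntilReachable("34.168.240.199", 31406, "hello"); err != nil {
		fmt.Printf("could not reach HTTP service: %v\n", err)
	}
}
```

Because every GET here fails with connection refused at the node address, the condition never returns true and PollImmediate reports the timeout, which the test surfaces as "Could not reach HTTP service through 34.168.240.199:31406 after 5m0s".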
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-network\]\sLoadBalancers\sshould\sbe\sable\sto\screate\sLoadBalancer\sService\swithout\sNodePort\sand\schange\sit\s\[Slow\]$'
test/e2e/framework/framework.go:241 k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc0011fc4b0) test/e2e/framework/framework.go:241 +0x96f There were additional failures detected after the initial failure: [PANICKED] Test Panicked In [AfterEach] at: /usr/local/go/src/runtime/panic.go:260 runtime error: invalid memory address or nil pointer dereference Full Stack Trace k8s.io/kubernetes/test/e2e/network.glob..func19.2() test/e2e/network/loadbalancer.go:73 +0x113 from junit_01.xml
[BeforeEach] [sig-network] LoadBalancers set up framework | framework.go:178 STEP: Creating a kubernetes client 11/26/22 18:54:19.132 Nov 26 18:54:19.132: INFO: >>> kubeConfig: /workspace/.kube/config STEP: Building a namespace api object, basename loadbalancers 11/26/22 18:54:19.134 Nov 26 18:54:19.173: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:21.212: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:23.212: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:25.213: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:27.214: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:29.213: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:31.213: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:33.213: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:35.214: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:37.213: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:39.213: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:41.213: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:43.213: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:48.002: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": read tcp 10.60.55.171:41390->34.83.88.61:443: read: connection reset by peer Nov 26 18:54:49.214: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:49.256: INFO: Unexpected error while creating namespace: Post "https://34.83.88.61/api/v1/namespaces": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:54:49.256: INFO: Unexpected error: <*errors.errorString | 0xc0001c1a00>: { s: "timed out waiting for the condition", } Nov 26 18:54:49.256: FAIL: timed out waiting for the condition Full Stack Trace k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc0011fc4b0) test/e2e/framework/framework.go:241 +0x96f [AfterEach] [sig-network] LoadBalancers test/e2e/framework/node/init/init.go:32 Nov 26 18:54:49.257: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready [AfterEach] [sig-network] LoadBalancers 
test/e2e/network/loadbalancer.go:71 [DeferCleanup (Each)] [sig-network] LoadBalancers dump namespaces | framework.go:196 STEP: dump namespace information after failure 11/26/22 18:54:49.296 [DeferCleanup (Each)] [sig-network] LoadBalancers tear down framework | framework.go:193
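This spec never reached its [It] body: framework.(*Framework).BeforeEach (framework.go:241) retried the namespace POST against the unreachable apiserver until the wait gave up with "timed out waiting for the condition", and the secondary [PANICKED] in [AfterEach] (loadbalancer.go:73) is most likely a knock-on nil-pointer dereference of state the aborted setup never initialized. Below is a hedged sketch of that kind of create-with-retry loop, written with client-go and wait.PollImmediate; the names, interval, and timeout are illustrative, not the framework's exact values.

```go
// Hedged sketch of a BeforeEach-style namespace setup: repeatedly create a
// namespace with a generated name until the apiserver accepts it or the wait
// times out. Illustrative only; not the e2e framework's implementation.
package main

import (
	"context"
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/util/wait"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func createTestNamespace(cs kubernetes.Interface, baseName string) (*corev1.Namespace, error) {
	var created *corev1.Namespace
	err := wait.PollImmediate(2*time.Second, 30*time.Second, func() (bool, error) {
		ns := &corev1.Namespace{
			// GenerateName is an illustrative naming choice, not the framework's scheme.
			ObjectMeta: metav1.ObjectMeta{GenerateName: "e2e-tests-" + baseName + "-"},
		}
		got, err := cs.CoreV1().Namespaces().Create(context.TODO(), ns, metav1.CreateOptions{})
		if err != nil {
			// An unreachable apiserver ("connect: connection refused") lands here and is retried.
			fmt.Printf("Unexpected error while creating namespace: %v\n", err)
			return false, nil
		}
		created = got
		return true, nil
	})
	return created, err // on timeout: "timed out waiting for the condition"
}

func main() {
	config, err := clientcmd.BuildConfigFromFlags("", "/workspace/.kube/config")
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(config)
	if _, err := createTestNamespace(cs, "loadbalancers"); err != nil {
		fmt.Printf("FAIL: %v\n", err)
	}
}
```

On timeout, wait returns its generic "timed out waiting for the condition" error, which is exactly the FAIL recorded at 18:54:49.256 above.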
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-network\]\sLoadBalancers\sshould\sbe\sable\sto\screate\san\sinternal\stype\sload\sbalancer\s\[Slow\]$'
test/e2e/network/service.go:4058 k8s.io/kubernetes/test/e2e/network.launchHostExecPod({0x801de88, 0xc00163b040}, {0xc00386a600, 0x12}, {0x75d7031, 0xd}) test/e2e/network/service.go:4058 +0x1bd k8s.io/kubernetes/test/e2e/network.glob..func19.6() test/e2e/network/loadbalancer.go:618 +0x485 There were additional failures detected after the initial failure: [FAILED] Nov 26 18:52:49.340: failed to list events in namespace "loadbalancers-9894": Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/events": dial tcp 34.83.88.61:443: connect: connection refused In [DeferCleanup (Each)] at: test/e2e/framework/debug/dump.go:44 ---------- [FAILED] Nov 26 18:52:49.380: Couldn't delete ns: "loadbalancers-9894": Delete "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894": dial tcp 34.83.88.61:443: connect: connection refused (&url.Error{Op:"Delete", URL:"https://34.83.88.61/api/v1/namespaces/loadbalancers-9894", Err:(*net.OpError)(0xc000d7da90)}) In [DeferCleanup (Each)] at: test/e2e/framework/framework.go:370 from junit_01.xml
[BeforeEach] [sig-network] LoadBalancers set up framework | framework.go:178 STEP: Creating a kubernetes client 11/26/22 18:47:13.974 Nov 26 18:47:13.974: INFO: >>> kubeConfig: /workspace/.kube/config STEP: Building a namespace api object, basename loadbalancers 11/26/22 18:47:13.976 STEP: Waiting for a default service account to be provisioned in namespace 11/26/22 18:47:14.118 STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 11/26/22 18:47:14.216 [BeforeEach] [sig-network] LoadBalancers test/e2e/framework/metrics/init/init.go:31 [BeforeEach] [sig-network] LoadBalancers test/e2e/network/loadbalancer.go:65 [It] should be able to create an internal type load balancer [Slow] test/e2e/network/loadbalancer.go:571 STEP: creating pod to be part of service lb-internal 11/26/22 18:47:14.4 Nov 26 18:47:14.451: INFO: Waiting up to 2m0s for 1 pods to be created Nov 26 18:47:14.497: INFO: Found all 1 pods Nov 26 18:47:14.497: INFO: Waiting up to 2m0s for 1 pods to be running and ready: [lb-internal-fmnnv] Nov 26 18:47:14.497: INFO: Waiting up to 2m0s for pod "lb-internal-fmnnv" in namespace "loadbalancers-9894" to be "running and ready" Nov 26 18:47:14.542: INFO: Pod "lb-internal-fmnnv": Phase="Pending", Reason="", readiness=false. Elapsed: 44.629818ms Nov 26 18:47:14.542: INFO: Error evaluating pod condition running and ready: want pod 'lb-internal-fmnnv' on 'bootstrap-e2e-minion-group-dzls' to be 'Running' but was 'Pending' Nov 26 18:47:16.589: INFO: Pod "lb-internal-fmnnv": Phase="Pending", Reason="", readiness=false. Elapsed: 2.091852956s Nov 26 18:47:16.589: INFO: Error evaluating pod condition running and ready: want pod 'lb-internal-fmnnv' on 'bootstrap-e2e-minion-group-dzls' to be 'Running' but was 'Pending' Nov 26 18:47:18.592: INFO: Pod "lb-internal-fmnnv": Phase="Running", Reason="", readiness=true. Elapsed: 4.093998813s Nov 26 18:47:18.592: INFO: Pod "lb-internal-fmnnv" satisfied condition "running and ready" Nov 26 18:47:18.592: INFO: Wanted all 1 pods to be running and ready. Result: true. Pods: [lb-internal-fmnnv] STEP: creating a service with type LoadBalancer and cloud specific Internal-LB annotation enabled 11/26/22 18:47:18.592 Nov 26 18:47:18.664: INFO: Waiting up to 15m0s for service "lb-internal" to have a LoadBalancer Nov 26 18:49:10.745: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:12.745: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:14.745: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:16.745: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:18.745: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:20.745: INFO: Retrying .... 
error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:22.744: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:24.745: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:26.745: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:28.745: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:30.745: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:32.745: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:34.745: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:36.745: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:38.746: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:40.745: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:42.746: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:44.745: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:46.746: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:48.745: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:50.745: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:52.745: INFO: Retrying .... 
error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:54.746: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:56.745: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:49:58.744: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:50:00.744: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:50:02.746: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:50:04.745: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:50:06.745: INFO: Retrying .... error trying to get Service lb-internal: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/services/lb-internal": dial tcp 34.83.88.61:443: connect: connection refused ------------------------------ Progress Report for Ginkgo Process #25 Automatically polling progress: [sig-network] LoadBalancers should be able to create an internal type load balancer [Slow] (Spec Runtime: 5m0.374s) test/e2e/network/loadbalancer.go:571 In [It] (Node Runtime: 5m0s) test/e2e/network/loadbalancer.go:571 At [By Step] creating a service with type LoadBalancer and cloud specific Internal-LB annotation enabled (Step Runtime: 4m55.756s) test/e2e/network/loadbalancer.go:593 Spec Goroutine goroutine 977 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc00012e000}, 0xc00386ae10, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc00012e000}, 0x30?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc00012e000}, 0xc00022cc60?, 0xc004ebbb80?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x0?, 0x7fa7740?, 0xc000220d00?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001b648c0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68) test/e2e/framework/service/jig.go:631 k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001b648c0, 0x0?) 
test/e2e/framework/service/jig.go:582 > k8s.io/kubernetes/test/e2e/network.glob..func19.6() test/e2e/network/loadbalancer.go:605 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0x2d591ce, 0xc000da4000}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ ------------------------------ Progress Report for Ginkgo Process #25 Automatically polling progress: [sig-network] LoadBalancers should be able to create an internal type load balancer [Slow] (Spec Runtime: 5m20.377s) test/e2e/network/loadbalancer.go:571 In [It] (Node Runtime: 5m20.003s) test/e2e/network/loadbalancer.go:571 At [By Step] creating a service with type LoadBalancer and cloud specific Internal-LB annotation enabled (Step Runtime: 5m15.759s) test/e2e/network/loadbalancer.go:593 Spec Goroutine goroutine 977 [select] k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc00012e000}, 0xc00386ae10, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:660 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc00012e000}, 0x30?, 0x2fd9d05?, 0x20?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc00012e000}, 0xc00022cc60?, 0xc004ebbb80?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x0?, 0x7fa7740?, 0xc000220d00?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).waitForCondition(0xc001b648c0, 0x4?, {0x7600fe2, 0x14}, 0x7895b68) test/e2e/framework/service/jig.go:631 k8s.io/kubernetes/test/e2e/framework/service.(*TestJig).WaitForLoadBalancer(0xc001b648c0, 0x0?) test/e2e/framework/service/jig.go:582 > k8s.io/kubernetes/test/e2e/network.glob..func19.6() test/e2e/network/loadbalancer.go:605 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0x2d591ce, 0xc000da4000}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ STEP: hitting the internal load balancer from pod 11/26/22 18:52:46.798 Nov 26 18:52:46.798: INFO: creating pod with host network Nov 26 18:52:46.798: INFO: Creating new host exec pod Nov 26 18:52:46.886: INFO: Waiting up to 5m0s for pod "ilb-host-exec" in namespace "loadbalancers-9894" to be "running and ready" Nov 26 18:52:47.055: INFO: Pod "ilb-host-exec": Phase="Pending", Reason="", readiness=false. 
Elapsed: 169.397264ms Nov 26 18:52:47.055: INFO: The phase of Pod ilb-host-exec is Pending, waiting for it to be Running (with Ready = true) Nov 26 18:52:49.095: INFO: Encountered non-retryable error while getting pod loadbalancers-9894/ilb-host-exec: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/pods/ilb-host-exec": dial tcp 34.83.88.61:443: connect: connection refused Nov 26 18:52:49.095: INFO: Unexpected error: <*fmt.wrapError | 0xc000397680>: { msg: "error while waiting for pod loadbalancers-9894/ilb-host-exec to be running and ready: Get \"https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/pods/ilb-host-exec\": dial tcp 34.83.88.61:443: connect: connection refused", err: <*url.Error | 0xc003583fb0>{ Op: "Get", URL: "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/pods/ilb-host-exec", Err: <*net.OpError | 0xc002bd9c70>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc00387bd10>{ IP: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 255, 255, 34, 83, 88, 61], Port: 443, Zone: "", }, Err: <*os.SyscallError | 0xc000397640>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, }, } Nov 26 18:52:49.095: FAIL: error while waiting for pod loadbalancers-9894/ilb-host-exec to be running and ready: Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/pods/ilb-host-exec": dial tcp 34.83.88.61:443: connect: connection refused Full Stack Trace k8s.io/kubernetes/test/e2e/network.launchHostExecPod({0x801de88, 0xc00163b040}, {0xc00386a600, 0x12}, {0x75d7031, 0xd}) test/e2e/network/service.go:4058 +0x1bd k8s.io/kubernetes/test/e2e/network.glob..func19.6() test/e2e/network/loadbalancer.go:618 +0x485 STEP: Clean up loadbalancer service 11/26/22 18:52:49.095 STEP: Delete service with finalizer 11/26/22 18:52:49.095 Nov 26 18:52:49.135: FAIL: Failed to delete service loadbalancers-9894/lb-internal Full Stack Trace k8s.io/kubernetes/test/e2e/framework/service.WaitForServiceDeletedWithFinalizer({0x801de88, 0xc00163b040}, {0xc00386aa98, 0x12}, {0xc002c8ee70, 0xb}) test/e2e/framework/service/wait.go:37 +0x185 k8s.io/kubernetes/test/e2e/network.glob..func19.6.3() test/e2e/network/loadbalancer.go:602 +0x67 panic({0x70eb7e0, 0xc00030afc0}) /usr/local/go/src/runtime/panic.go:884 +0x212 k8s.io/kubernetes/test/e2e/framework.Fail({0xc0002fc0e0, 0xde}, {0xc0015cdc20?, 0xc0002fc0e0?, 0xc0015cdc48?}) test/e2e/framework/log.go:61 +0x145 k8s.io/kubernetes/test/e2e/framework.ExpectNoErrorWithOffset(0x1, {0x7fa3f20, 0xc000397680}, {0x0?, 0xc00386a600?, 0x0?}) test/e2e/framework/expect.go:76 +0x267 k8s.io/kubernetes/test/e2e/framework.ExpectNoError(...) 
test/e2e/framework/expect.go:43 k8s.io/kubernetes/test/e2e/network.launchHostExecPod({0x801de88, 0xc00163b040}, {0xc00386a600, 0x12}, {0x75d7031, 0xd}) test/e2e/network/service.go:4058 +0x1bd k8s.io/kubernetes/test/e2e/network.glob..func19.6() test/e2e/network/loadbalancer.go:618 +0x485 [AfterEach] [sig-network] LoadBalancers test/e2e/framework/node/init/init.go:32 Nov 26 18:52:49.135: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready [AfterEach] [sig-network] LoadBalancers test/e2e/network/loadbalancer.go:71 Nov 26 18:52:49.175: INFO: Output of kubectl describe svc: Nov 26 18:52:49.175: INFO: Running '/workspace/github.com/containerd/containerd/kubernetes/platforms/linux/amd64/kubectl --server=https://34.83.88.61 --kubeconfig=/workspace/.kube/config --namespace=loadbalancers-9894 describe svc --namespace=loadbalancers-9894' Nov 26 18:52:49.300: INFO: rc: 1 Nov 26 18:52:49.300: INFO: [DeferCleanup (Each)] [sig-network] LoadBalancers test/e2e/framework/metrics/init/init.go:33 [DeferCleanup (Each)] [sig-network] LoadBalancers dump namespaces | framework.go:196 STEP: dump namespace information after failure 11/26/22 18:52:49.3 STEP: Collecting events from namespace "loadbalancers-9894". 11/26/22 18:52:49.3 Nov 26 18:52:49.340: INFO: Unexpected error: failed to list events in namespace "loadbalancers-9894": <*url.Error | 0xc005042f60>: { Op: "Get", URL: "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/events", Err: <*net.OpError | 0xc00504cf50>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc000d7f110>{ IP: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 255, 255, 34, 83, 88, 61], Port: 443, Zone: "", }, Err: <*os.SyscallError | 0xc0016a3000>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, } Nov 26 18:52:49.340: FAIL: failed to list events in namespace "loadbalancers-9894": Get "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894/events": dial tcp 34.83.88.61:443: connect: connection refused Full Stack Trace k8s.io/kubernetes/test/e2e/framework/debug.dumpEventsInNamespace(0xc0015cc5c0, {0xc00386a600, 0x12}) test/e2e/framework/debug/dump.go:44 +0x191 k8s.io/kubernetes/test/e2e/framework/debug.DumpAllNamespaceInfo({0x801de88, 0xc00163b040}, {0xc00386a600, 0x12}) test/e2e/framework/debug/dump.go:62 +0x8d k8s.io/kubernetes/test/e2e/framework/debug/init.init.0.func1.1(0xc0015cc650?, {0xc00386a600?, 0x7fa7740?}) test/e2e/framework/debug/init/init.go:34 +0x32 k8s.io/kubernetes/test/e2e/framework.(*Framework).dumpNamespaceInfo.func1() test/e2e/framework/framework.go:274 +0x6d k8s.io/kubernetes/test/e2e/framework.(*Framework).dumpNamespaceInfo(0xc000d8c4b0) test/e2e/framework/framework.go:271 +0x179 reflect.Value.call({0x6627cc0?, 0xc004eb6140?, 0xc004254f50?}, {0x75b6e72, 0x4}, {0xae73300, 0x0, 0x0?}) /usr/local/go/src/reflect/value.go:584 +0x8c5 reflect.Value.Call({0x6627cc0?, 0xc004eb6140?, 0x7fadfa0?}, {0xae73300?, 0xc004254f80?, 0x26225bd?}) /usr/local/go/src/reflect/value.go:368 +0xbc [DeferCleanup (Each)] [sig-network] LoadBalancers tear down framework | framework.go:193 STEP: Destroying namespace "loadbalancers-9894" for this suite. 
11/26/22 18:52:49.341 Nov 26 18:52:49.380: FAIL: Couldn't delete ns: "loadbalancers-9894": Delete "https://34.83.88.61/api/v1/namespaces/loadbalancers-9894": dial tcp 34.83.88.61:443: connect: connection refused (&url.Error{Op:"Delete", URL:"https://34.83.88.61/api/v1/namespaces/loadbalancers-9894", Err:(*net.OpError)(0xc000d7da90)}) Full Stack Trace k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach.func1() test/e2e/framework/framework.go:370 +0x4fe k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc000d8c4b0) test/e2e/framework/framework.go:383 +0x1ca reflect.Value.call({0x6627cc0?, 0xc004eb60c0?, 0xc0020cafb0?}, {0x75b6e72, 0x4}, {0xae73300, 0x0, 0x0?}) /usr/local/go/src/reflect/value.go:584 +0x8c5 reflect.Value.Call({0x6627cc0?, 0xc004eb60c0?, 0x0?}, {0xae73300?, 0x5?, 0xc000de2a40?}) /usr/local/go/src/reflect/value.go:368 +0xbc
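The stack trace above shows the test blocked in the service test jig's wait loop (TestJig.WaitForLoadBalancer, which polls via wait.PollImmediate) while the apiserver at 34.83.88.61:443 refused connections, hence the stream of "Retrying ...." lines roughly every two seconds until the 15m0s budget or the spec itself gave out. As a rough illustration only, not the framework's actual code, a poll of this shape could be written with client-go and the apimachinery wait package like so; the package name, function name, clientset, and 2s interval are placeholders chosen for the sketch.

```go
// Package lbwait: a minimal sketch of the polling pattern visible in the
// stack trace above. It is NOT the e2e framework's implementation.
package lbwait

import (
	"context"
	"fmt"
	"time"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/util/wait"
	"k8s.io/client-go/kubernetes"
)

// waitForLoadBalancerIngress re-fetches the Service until the cloud provider
// publishes an ingress IP/hostname or the timeout expires. Transient apiserver
// errors (e.g. "connection refused" during a control-plane restart) are logged
// and retried rather than failing the wait, mirroring the "Retrying ...." lines.
func waitForLoadBalancerIngress(cs kubernetes.Interface, namespace, name string, timeout time.Duration) error {
	return wait.PollImmediate(2*time.Second, timeout, func() (bool, error) {
		svc, err := cs.CoreV1().Services(namespace).Get(context.TODO(), name, metav1.GetOptions{})
		if err != nil {
			fmt.Printf("Retrying .... error trying to get Service %s: %v\n", name, err)
			return false, nil // keep polling on transient errors
		}
		// Done once status.loadBalancer.ingress is populated.
		return len(svc.Status.LoadBalancer.Ingress) > 0, nil
	})
}
```

In this run the poll never succeeds within the test's patience because the apiserver itself is down, which is also why the later cleanup (event dump, namespace deletion) fails with the same "connection refused".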
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-network\]\sLoadBalancers\sshould\sbe\sable\sto\sswitch\ssession\saffinity\sfor\sLoadBalancer\sservice\swith\sESIPP\son\s\[Slow\]\s\[LinuxOnly\]$'
test/e2e/network/service.go:263 k8s.io/kubernetes/test/e2e/network.checkAffinityFailed({{0xc003f6c000?, 0xc0029d6b60?, 0xc0020a2fb0?}}, {0x7654c95, 0x20}) test/e2e/network/service.go:263 +0x85 k8s.io/kubernetes/test/e2e/network.checkAffinity({0x801de88?, 0xc0029d6b60?}, 0x0?, {0xc0020a2fb0?, 0x7638a85?}, 0x7f8aa84?, 0x0) test/e2e/network/service.go:224 +0x27b k8s.io/kubernetes/test/e2e/network.execAffinityTestForLBServiceWithOptionalTransition(0x7638a85?, {0x801de88, 0xc0029d6b60}, 0xc0004c7180, 0x1) test/e2e/network/service.go:4002 +0x44b k8s.io/kubernetes/test/e2e/network.execAffinityTestForLBServiceWithTransition(...) test/e2e/network/service.go:3962 k8s.io/kubernetes/test/e2e/network.glob..func19.9() test/e2e/network/loadbalancer.go:787 +0xf3
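The failure above originates in checkAffinity (called from execAffinityTestForLBServiceWithOptionalTransition): the test repeatedly hits the LoadBalancer address and, with session affinity enabled, expects every response to name the same backend pod, which the echo backend reports as its hostname (the "Received response from host: affinity-lb-esipp-transition-m2r6p" lines below). A simplified stand-in for that check, not the framework's checkAffinity itself and with a hypothetical function name, URL handling, and retry policy, might look like this:

```go
// Package affinitysketch: a simplified sketch of a ClientIP session-affinity
// check against a LoadBalancer endpoint. NOT the e2e framework's checkAffinity.
package affinitysketch

import (
	"fmt"
	"io"
	"net/http"
	"time"
)

// checkClientIPAffinity sends `requests` GETs to url and requires that every
// successful response reports the same serving pod (the backend echoes its
// hostname in the response body).
func checkClientIPAffinity(url string, requests int) error {
	client := &http.Client{Timeout: 2 * time.Second}
	var first string
	for i := 0; i < requests; i++ {
		resp, err := client.Get(url)
		if err != nil {
			// Brief connection refusals while the cloud LB programs its
			// backends are expected right after provisioning; retry.
			time.Sleep(time.Second)
			continue
		}
		body, readErr := io.ReadAll(resp.Body)
		resp.Body.Close()
		if readErr != nil {
			return readErr
		}
		host := string(body)
		if first == "" {
			first = host
		} else if host != first {
			return fmt.Errorf("affinity broken: got responses from %q and %q", first, host)
		}
	}
	if first == "" {
		return fmt.Errorf("no successful responses from %s", url)
	}
	return nil
}
```

The log that follows shows exactly this pattern in practice: an initial burst of connection-refused pokes while the LB comes up, then a long run of successful pokes all answered by the same pod, which is what the affinity assertion is checking.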
[BeforeEach] [sig-network] LoadBalancers set up framework | framework.go:178 STEP: Creating a kubernetes client 11/26/22 18:45:16.405 Nov 26 18:45:16.405: INFO: >>> kubeConfig: /workspace/.kube/config STEP: Building a namespace api object, basename loadbalancers 11/26/22 18:45:16.406 STEP: Waiting for a default service account to be provisioned in namespace 11/26/22 18:45:16.566 STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 11/26/22 18:45:16.658 [BeforeEach] [sig-network] LoadBalancers test/e2e/framework/metrics/init/init.go:31 [BeforeEach] [sig-network] LoadBalancers test/e2e/network/loadbalancer.go:65 [It] should be able to switch session affinity for LoadBalancer service with ESIPP on [Slow] [LinuxOnly] test/e2e/network/loadbalancer.go:780 STEP: creating service in namespace loadbalancers-5615 11/26/22 18:45:16.829 STEP: creating service affinity-lb-esipp-transition in namespace loadbalancers-5615 11/26/22 18:45:16.829 STEP: creating replication controller affinity-lb-esipp-transition in namespace loadbalancers-5615 11/26/22 18:45:16.981 I1126 18:45:17.055021 10221 runners.go:193] Created replication controller with name: affinity-lb-esipp-transition, namespace: loadbalancers-5615, replica count: 3 I1126 18:45:20.157224 10221 runners.go:193] affinity-lb-esipp-transition Pods: 3 out of 3 created, 1 running, 2 pending, 0 waiting, 0 inactive, 0 terminating, 0 unknown, 0 runningButNotReady I1126 18:45:23.158134 10221 runners.go:193] affinity-lb-esipp-transition Pods: 3 out of 3 created, 3 running, 0 pending, 0 waiting, 0 inactive, 0 terminating, 0 unknown, 0 runningButNotReady STEP: waiting for loadbalancer for service loadbalancers-5615/affinity-lb-esipp-transition 11/26/22 18:45:23.208 Nov 26 18:45:23.347: INFO: Waiting up to 15m0s for service "affinity-lb-esipp-transition" to have a LoadBalancer Nov 26 18:48:17.661: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:19.662: INFO: Poke("http://34.127.8.109:80"): Get "http://34.127.8.109:80": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Nov 26 18:48:19.662: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:19.702: INFO: Poke("http://34.127.8.109:80"): Get "http://34.127.8.109:80": dial tcp 34.127.8.109:80: connect: connection refused Nov 26 18:48:19.702: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:19.741: INFO: Poke("http://34.127.8.109:80"): Get "http://34.127.8.109:80": dial tcp 34.127.8.109:80: connect: connection refused Nov 26 18:48:19.741: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:19.780: INFO: Poke("http://34.127.8.109:80"): Get "http://34.127.8.109:80": dial tcp 34.127.8.109:80: connect: connection refused Nov 26 18:48:19.780: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:19.820: INFO: Poke("http://34.127.8.109:80"): Get "http://34.127.8.109:80": dial tcp 34.127.8.109:80: connect: connection refused Nov 26 18:48:19.820: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:19.859: INFO: Poke("http://34.127.8.109:80"): Get "http://34.127.8.109:80": dial tcp 34.127.8.109:80: connect: connection refused Nov 26 18:48:19.859: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:19.899: INFO: Poke("http://34.127.8.109:80"): Get "http://34.127.8.109:80": dial tcp 34.127.8.109:80: connect: connection refused Nov 26 18:48:19.899: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:19.939: INFO: Poke("http://34.127.8.109:80"): Get "http://34.127.8.109:80": dial tcp 34.127.8.109:80: connect: connection refused Nov 26 18:48:19.939: INFO: Poking "http://34.127.8.109:80" Nov 26 
18:48:19.978: INFO: Poke("http://34.127.8.109:80"): Get "http://34.127.8.109:80": dial tcp 34.127.8.109:80: connect: connection refused Nov 26 18:48:19.978: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:20.017: INFO: Poke("http://34.127.8.109:80"): Get "http://34.127.8.109:80": dial tcp 34.127.8.109:80: connect: connection refused Nov 26 18:48:20.017: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:20.057: INFO: Poke("http://34.127.8.109:80"): Get "http://34.127.8.109:80": dial tcp 34.127.8.109:80: connect: connection refused Nov 26 18:48:20.057: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:20.096: INFO: Poke("http://34.127.8.109:80"): Get "http://34.127.8.109:80": dial tcp 34.127.8.109:80: connect: connection refused Nov 26 18:48:20.096: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:20.136: INFO: Poke("http://34.127.8.109:80"): Get "http://34.127.8.109:80": dial tcp 34.127.8.109:80: connect: connection refused Nov 26 18:48:20.136: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:20.175: INFO: Poke("http://34.127.8.109:80"): Get "http://34.127.8.109:80": dial tcp 34.127.8.109:80: connect: connection refused Nov 26 18:48:20.175: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:20.215: INFO: Poke("http://34.127.8.109:80"): Get "http://34.127.8.109:80": dial tcp 34.127.8.109:80: connect: connection refused Nov 26 18:48:22.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:22.295: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:22.295: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:22.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:22.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:22.452: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:22.452: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:22.531: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:22.531: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:22.609: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:22.609: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:22.688: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:22.688: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:22.766: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:22.766: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:22.845: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:22.845: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:22.923: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:22.923: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:23.001: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:23.001: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:23.080: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:23.080: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:23.158: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:23.158: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:23.237: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:23.237: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:23.315: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:23.316: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:23.394: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:23.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:23.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:23.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:23.394: INFO: Received 
response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:23.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:23.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:23.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:23.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:23.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:23.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:23.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:23.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:23.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:23.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:23.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:24.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:24.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:24.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:24.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:24.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:24.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:24.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:24.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:24.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:24.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:24.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:24.686: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:24.686: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:24.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:24.765: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:24.844: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:24.844: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:24.922: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:24.922: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:25.001: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:25.001: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:25.079: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:25.079: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:25.158: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:25.158: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:25.236: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:25.236: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:25.314: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:25.314: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:25.393: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:25.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:25.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:25.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:25.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:25.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:25.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:25.393: INFO: Received response from host: 
affinity-lb-esipp-transition-m2r6p Nov 26 18:48:25.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:25.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:25.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:25.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:25.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:25.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:25.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:25.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:26.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:26.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:26.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:26.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:26.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:26.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:26.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:26.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:26.530: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:26.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:26.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:26.687: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:26.687: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:26.767: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:26.767: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:26.845: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:26.845: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:26.924: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:26.924: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:27.002: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:27.002: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:27.081: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:27.081: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:27.159: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:27.159: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:27.237: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:27.237: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:27.316: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:27.316: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:27.394: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:27.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:27.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:27.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:27.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:27.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:27.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:27.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:27.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:27.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:27.394: INFO: Received response from host: 
affinity-lb-esipp-transition-m2r6p Nov 26 18:48:27.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:27.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:27.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:27.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:27.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:28.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:28.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:28.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:28.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:28.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:28.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:28.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:28.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:28.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:28.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:28.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:28.686: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:28.687: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:28.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:28.765: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:28.844: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:28.844: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:28.922: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:28.922: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:29.001: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:29.001: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:29.079: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:29.079: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:29.158: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:29.158: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:29.236: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:29.236: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:29.315: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:29.315: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:29.393: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:29.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:29.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:29.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:29.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:29.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:29.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:29.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:29.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:29.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:29.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:29.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:29.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:29.393: INFO: Received response from host: 
affinity-lb-esipp-transition-m2r6p Nov 26 18:48:29.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:29.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:30.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:30.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:30.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:30.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:30.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:30.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:30.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:30.530: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:30.530: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:30.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:30.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:30.686: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:30.686: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:30.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:30.765: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:30.844: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:30.844: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:30.923: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:30.923: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:31.001: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:31.001: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:31.079: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:31.079: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:31.158: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:31.158: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:31.237: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:31.237: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:31.315: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:31.315: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:31.393: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:31.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:31.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:31.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:31.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:31.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:31.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:31.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:31.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:31.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:31.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:31.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:31.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:31.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:31.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:31.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:32.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:32.294: 
INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:32.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:32.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:32.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:32.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:32.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:32.530: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:32.530: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:32.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:32.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:32.687: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:32.687: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:32.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:32.765: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:32.843: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:32.843: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:32.922: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:32.922: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:33.000: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:33.000: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:33.078: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:33.078: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:33.157: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:33.157: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:33.235: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:33.235: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:33.314: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:33.314: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:33.392: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:33.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:33.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:33.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:33.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:33.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:33.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:33.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:33.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:33.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:33.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:33.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:33.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:33.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:33.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:33.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:34.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:34.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:34.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:34.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:34.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:34.451: INFO: Poke("http://34.127.8.109:80"): success 
Nov 26 18:48:34.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:34.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:34.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:34.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:34.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:34.687: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:34.687: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:34.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:34.765: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:34.844: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:34.844: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:34.922: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:34.922: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:35.001: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:35.001: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:35.079: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:35.079: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:35.157: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:35.157: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:35.236: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:35.236: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:35.315: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:35.315: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:35.393: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:36.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:36.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:36.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:36.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:36.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:36.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:36.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:36.530: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:36.530: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:36.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:36.608: INFO: Poking 
"http://34.127.8.109:80" Nov 26 18:48:36.687: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:36.687: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:36.766: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:36.766: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:36.844: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:36.844: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:36.924: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:36.924: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:37.003: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:37.003: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:37.081: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:37.081: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:37.160: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:37.160: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:37.241: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:37.241: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:37.319: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:37.319: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:37.398: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:37.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:37.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:37.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:37.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:37.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:37.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:37.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:37.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:37.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:37.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:37.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:37.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:37.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:37.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:37.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:38.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:38.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:38.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:38.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:38.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:38.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:38.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:38.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:38.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:38.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:38.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:38.687: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:38.687: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:38.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:38.765: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:38.844: 
INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:38.844: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:38.922: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:38.922: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:39.000: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:39.000: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:39.079: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:39.079: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:39.157: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:39.157: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:39.236: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:39.236: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:39.314: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:39.314: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:39.393: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:39.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:39.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:39.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:39.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:39.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:39.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:39.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:39.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:39.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:39.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:39.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:39.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:39.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:39.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:39.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:40.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:40.295: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:40.295: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:40.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:40.374: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:40.452: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:40.452: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:40.531: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:40.531: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:40.609: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:40.609: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:40.693: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:40.693: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:40.772: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:40.772: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:40.850: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:40.850: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:40.929: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:40.929: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:41.007: INFO: Poke("http://34.127.8.109:80"): success 
Nov 26 18:48:41.007: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:41.086: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:41.086: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:41.165: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:41.165: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:41.243: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:41.243: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:41.322: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:41.322: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:41.400: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:41.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:41.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:41.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:41.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:41.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:41.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:41.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:41.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:41.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:41.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:41.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:41.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:41.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:41.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:41.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:42.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:42.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:42.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:42.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:42.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:42.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:42.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:42.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:42.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:42.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:42.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:42.696: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:42.696: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:42.778: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:42.778: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:42.856: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:42.856: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:42.935: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:42.935: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:43.013: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:43.013: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:43.093: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:43.093: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:43.171: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:43.171: INFO: Poking 
"http://34.127.8.109:80" Nov 26 18:48:43.249: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:43.249: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:43.328: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:43.328: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:43.406: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:43.407: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:43.407: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:43.407: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:43.407: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:43.407: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:43.407: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:43.407: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:43.407: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:43.407: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:43.407: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:43.407: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:43.407: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:43.407: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:43.407: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:43.407: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:44.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:44.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:44.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:44.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:44.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:44.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:44.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:44.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:44.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:44.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:44.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:44.686: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:44.686: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:44.764: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:44.764: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:44.843: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:44.843: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:44.922: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:44.922: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:45.000: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:45.000: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:45.079: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:45.079: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:45.157: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:45.157: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:45.235: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:45.235: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:45.314: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:45.314: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:45.393: 
INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:45.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:45.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:45.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:45.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:45.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:45.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:45.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:45.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:45.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:45.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:45.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:45.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:45.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:45.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:45.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:46.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:46.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:46.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:46.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:46.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:46.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:46.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:46.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:46.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:46.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:46.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:46.686: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:46.686: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:46.780: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:46.780: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:46.859: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:46.859: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:46.937: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:46.937: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:47.016: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:47.016: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:47.094: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:47.094: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:47.173: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:47.173: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:47.251: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:47.251: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:47.329: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:47.329: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:47.408: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:47.408: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:47.408: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:47.408: INFO: Received response from host: 
affinity-lb-esipp-transition-m2r6p Nov 26 18:48:47.408: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:47.408: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:47.408: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:47.408: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:47.408: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:47.408: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:47.408: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:47.408: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:47.408: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:47.408: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:47.408: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:47.408: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:48.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:48.295: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:48.295: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:48.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:48.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:48.452: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:48.452: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:48.530: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:48.530: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:48.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:48.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:48.687: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:48.687: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:48.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:48.765: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:48.843: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:48.843: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:48.921: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:48.922: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:49.000: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:49.000: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:49.079: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:49.079: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:49.157: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:49.157: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:49.236: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:49.236: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:49.314: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:49.314: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:49.392: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:49.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:49.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:49.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:49.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:49.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:49.392: INFO: Received response from host: 
affinity-lb-esipp-transition-m2r6p Nov 26 18:48:49.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:49.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:49.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:49.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:49.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:49.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:49.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:49.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:49.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:50.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:50.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:50.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:50.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:50.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:50.450: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:50.450: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:50.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:50.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:50.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:50.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:50.686: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:50.686: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:50.764: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:50.764: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:50.843: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:50.843: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:50.922: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:50.922: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:51.000: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:51.000: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:51.079: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:51.079: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:51.157: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:51.157: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:51.236: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:51.236: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:51.314: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:51.314: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:51.392: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:51.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:51.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:51.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:51.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:51.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:51.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:51.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:51.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:51.392: INFO: Received response from host: 
affinity-lb-esipp-transition-m2r6p Nov 26 18:48:51.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:51.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:51.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:51.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:51.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:51.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:52.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:52.295: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:52.295: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:52.374: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:52.374: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:52.453: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:52.453: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:52.532: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:52.532: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:52.611: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:52.611: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:52.689: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:52.689: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:52.768: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:52.768: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:52.846: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:52.846: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:52.925: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:52.925: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:53.003: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:53.003: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:53.085: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:53.085: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:53.163: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:53.163: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:53.244: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:53.244: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:53.323: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:53.323: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:53.401: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:53.401: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:53.401: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:53.401: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:53.401: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:53.401: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:53.401: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:53.401: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:53.401: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:53.401: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:53.401: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:53.401: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:53.401: INFO: Received response from host: 
affinity-lb-esipp-transition-m2r6p Nov 26 18:48:53.401: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:53.401: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:53.401: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:54.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:54.295: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:54.295: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:54.374: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:54.374: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:54.452: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:54.452: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:54.531: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:54.531: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:54.609: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:54.609: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:54.688: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:54.688: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:54.766: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:54.766: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:54.844: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:54.844: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:54.923: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:54.923: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:55.001: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:55.001: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:55.080: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:55.080: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:55.158: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:55.158: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:55.236: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:55.236: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:55.315: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:55.315: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:55.393: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:55.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:55.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:55.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:55.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:55.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:55.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:55.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:55.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:55.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:55.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:55.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:55.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:55.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:55.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:55.394: INFO: Received response from host: 
affinity-lb-esipp-transition-m2r6p Nov 26 18:48:56.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:56.293: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:56.293: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:56.371: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:56.371: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:56.450: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:56.450: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:56.528: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:56.528: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:56.606: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:56.606: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:56.685: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:56.685: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:56.763: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:56.763: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:56.841: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:56.841: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:56.919: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:56.919: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:56.998: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:56.998: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:57.076: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:57.076: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:57.155: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:57.155: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:57.234: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:57.234: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:57.312: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:57.312: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:57.390: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:57.390: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:57.390: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:57.390: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:57.390: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:57.390: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:57.390: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:57.390: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:57.390: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:57.390: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:57.390: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:57.390: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:57.390: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:57.390: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:57.390: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:57.390: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:58.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:58.295: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:58.295: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:58.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 
18:48:58.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:58.452: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:58.452: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:58.531: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:58.531: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:58.609: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:58.609: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:58.687: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:58.687: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:58.766: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:58.766: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:58.844: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:58.844: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:58.922: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:58.922: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:59.001: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:59.001: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:59.079: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:59.079: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:59.158: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:59.158: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:59.236: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:59.236: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:59.314: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:59.314: INFO: Poking "http://34.127.8.109:80" Nov 26 18:48:59.392: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:48:59.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:59.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:59.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:59.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:59.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:59.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:59.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:59.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:59.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:59.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:59.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:59.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:59.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:59.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:48:59.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:00.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:00.293: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:00.293: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:00.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:00.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:00.450: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:00.450: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:00.528: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:00.528: INFO: Poking 
"http://34.127.8.109:80" Nov 26 18:49:00.607: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:00.607: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:00.691: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:00.691: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:00.769: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:00.769: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:00.848: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:00.848: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:00.927: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:00.927: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:01.005: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:01.005: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:01.084: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:01.084: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:01.162: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:01.162: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:01.240: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:01.241: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:01.319: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:01.319: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:01.398: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:01.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:01.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:01.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:01.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:01.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:01.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:01.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:01.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:01.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:01.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:01.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:01.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:01.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:01.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:01.398: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:02.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:02.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:02.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:02.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:02.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:02.452: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:02.452: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:02.530: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:02.530: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:02.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:02.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:02.688: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:02.688: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:02.768: 
INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:02.768: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:02.846: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:02.846: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:02.925: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:02.925: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:03.003: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:03.003: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:03.082: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:03.082: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:03.160: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:03.160: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:03.239: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:03.239: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:03.317: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:03.317: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:03.396: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:03.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:03.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:03.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:03.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:03.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:03.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:03.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:03.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:03.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:03.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:03.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:03.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:03.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:03.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:03.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:04.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:04.297: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:04.297: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:04.377: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:04.377: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:04.456: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:04.456: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:04.534: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:04.534: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:04.612: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:04.612: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:04.691: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:04.691: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:04.769: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:04.769: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:04.849: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:04.849: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:04.927: INFO: Poke("http://34.127.8.109:80"): success 
Nov 26 18:49:04.927: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:05.006: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:05.006: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:05.090: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:05.090: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:05.168: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:05.168: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:05.247: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:05.247: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:05.325: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:05.325: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:05.403: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:05.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:05.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:05.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:05.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:05.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:05.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:05.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:05.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:05.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:05.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:05.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:05.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:05.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:05.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:05.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:06.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:06.293: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:06.293: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:06.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:06.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:06.450: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:06.450: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:06.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:06.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:06.607: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:06.607: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:06.688: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:06.688: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:06.772: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:06.772: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:06.851: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:06.851: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:06.930: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:06.930: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:07.009: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:07.009: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:07.087: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:07.087: INFO: Poking 
"http://34.127.8.109:80" Nov 26 18:49:07.165: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:07.165: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:07.243: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:07.244: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:07.331: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:07.331: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:07.410: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:07.411: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:07.411: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:07.411: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:07.411: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:07.411: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:07.411: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:07.411: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:07.411: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:07.411: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:07.411: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:07.411: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:07.411: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:07.411: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:07.411: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:07.411: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:08.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:08.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:08.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:08.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:08.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:08.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:08.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:08.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:08.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:08.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:08.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:08.691: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:08.691: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:08.769: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:08.769: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:08.848: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:08.848: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:08.926: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:08.926: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:09.004: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:09.005: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:09.083: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:09.083: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:09.161: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:09.161: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:09.240: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:09.240: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:09.318: 
INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:09.318: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:09.396: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:09.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:09.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:09.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:09.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:09.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:09.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:09.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:09.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:09.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:09.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:09.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:09.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:09.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:09.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:09.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:10.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:10.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:10.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:10.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:10.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:10.450: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:10.450: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:10.528: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:10.528: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:10.607: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:10.607: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:10.685: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:10.685: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:10.763: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:10.763: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:10.842: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:10.842: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:10.920: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:10.920: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:10.998: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:10.999: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:11.077: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:11.077: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:11.156: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:11.156: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:11.234: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:11.234: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:11.313: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:11.313: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:11.391: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:11.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:11.391: INFO: Received 
response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:11.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:11.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:11.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:11.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:11.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:11.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:11.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:11.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:11.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:11.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:11.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:11.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:11.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:12.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:12.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:12.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:12.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:12.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:12.450: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:12.450: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:12.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:12.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:12.607: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:12.607: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:12.686: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:12.686: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:12.764: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:12.764: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:12.842: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:12.842: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:12.920: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:12.920: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:12.999: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:12.999: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:13.077: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:13.077: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:13.156: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:13.156: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:13.234: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:13.234: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:13.312: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:13.312: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:13.391: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:13.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:13.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:13.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:13.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:13.391: INFO: Received response from host: 
affinity-lb-esipp-transition-m2r6p Nov 26 18:49:13.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:13.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:13.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:13.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:13.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:13.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:13.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:13.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:13.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:13.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:14.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:14.295: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:14.295: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:14.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:14.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:14.452: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:14.452: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:14.530: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:14.530: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:14.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:14.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:14.686: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:14.686: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:14.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:14.765: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:14.843: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:14.843: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:14.921: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:14.921: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:15.000: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:15.000: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:15.078: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:15.078: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:15.157: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:15.157: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:15.235: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:15.235: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:15.314: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:15.314: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:15.392: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:15.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:15.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:15.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:15.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:15.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:15.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:15.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:15.392: INFO: Received response from host: 
affinity-lb-esipp-transition-m2r6p Nov 26 18:49:15.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:15.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:15.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:15.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:15.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:15.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:15.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:16.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:16.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:16.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:16.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:16.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:16.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:16.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:16.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:16.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:16.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:16.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:16.686: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:16.686: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:16.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:16.765: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:16.843: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:16.843: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:16.922: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:16.922: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:17.000: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:17.000: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:17.079: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:17.079: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:17.157: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:17.157: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:17.235: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:17.235: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:17.314: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:17.314: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:17.392: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:17.393: INFO: Received response from host: 
affinity-lb-esipp-transition-m2r6p Nov 26 18:49:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:18.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:18.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:18.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:18.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:18.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:18.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:18.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:18.530: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:18.530: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:18.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:18.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:18.687: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:18.687: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:18.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:18.765: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:18.844: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:18.844: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:18.923: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:18.923: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:19.001: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:19.001: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:19.080: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:19.080: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:19.158: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:19.158: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:19.236: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:19.236: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:19.314: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:19.314: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:19.393: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:19.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:19.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:19.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:19.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:19.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:19.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:19.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:19.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:19.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:19.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:19.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:19.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:19.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:19.393: INFO: Received response from host: 
affinity-lb-esipp-transition-m2r6p Nov 26 18:49:19.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:20.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:20.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:20.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:20.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:20.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:20.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:20.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:20.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:20.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:20.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:20.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:20.686: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:20.686: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:20.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:20.765: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:20.843: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:20.843: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:20.921: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:20.921: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:21.000: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:21.000: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:21.078: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:21.078: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:21.157: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:21.157: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:21.236: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:21.236: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:21.314: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:21.314: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:21.393: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:21.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:21.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:21.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:21.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:21.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:21.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:21.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:21.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:21.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:21.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:21.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:21.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:21.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:21.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:21.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:22.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:22.295: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:22.295: INFO: Poking 
"http://34.127.8.109:80" Nov 26 18:49:22.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:22.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:22.452: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:22.452: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:22.530: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:22.530: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:22.609: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:22.609: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:22.687: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:22.687: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:22.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:22.765: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:22.843: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:22.843: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:22.921: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:22.921: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:23.000: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:23.000: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:23.078: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:23.078: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:23.157: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:23.157: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:23.235: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:23.235: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:23.313: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:23.313: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:23.392: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:23.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:23.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:23.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:23.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:23.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:23.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:23.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:23.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:23.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:23.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:23.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:23.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:23.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:23.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:23.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:24.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:24.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:24.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:24.376: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:24.376: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:24.455: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:24.455: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:24.533: 
INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:24.533: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:24.612: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:24.612: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:24.691: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:24.691: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:24.769: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:24.770: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:24.848: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:24.848: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:24.926: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:24.926: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:25.005: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:25.005: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:25.084: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:25.084: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:25.163: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:25.163: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:25.241: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:25.241: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:25.320: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:25.320: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:25.399: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:25.399: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:25.399: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:25.399: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:25.399: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:25.399: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:25.399: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:25.399: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:25.399: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:25.399: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:25.399: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:25.399: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:25.399: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:25.399: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:25.399: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:25.399: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:26.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:26.293: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:26.293: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:26.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:26.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:26.450: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:26.450: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:26.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:26.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:26.607: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:26.607: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:26.685: INFO: Poke("http://34.127.8.109:80"): success 
Nov 26 18:49:26.686: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:26.764: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:26.764: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:26.843: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:26.843: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:26.921: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:26.921: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:26.999: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:26.999: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:27.078: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:27.078: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:27.156: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:27.156: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:27.234: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:27.234: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:27.313: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:27.313: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:27.391: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:27.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:27.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:27.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:27.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:27.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:27.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:27.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:27.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:27.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:27.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:27.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:27.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:27.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:27.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:27.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:28.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:28.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:28.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:28.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:28.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:28.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:28.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:28.530: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:28.530: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:28.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:28.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:28.692: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:28.692: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:28.770: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:28.770: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:28.849: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:28.849: INFO: Poking 
"http://34.127.8.109:80" Nov 26 18:49:28.928: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:28.928: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:29.010: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:29.010: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:29.090: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:29.090: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:29.168: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:29.168: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:29.246: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:29.246: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:29.324: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:29.325: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:29.403: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:29.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:29.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:29.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:29.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:29.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:29.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:29.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:29.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:29.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:29.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:29.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:29.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:29.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:29.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:29.403: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:30.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:30.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:30.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:30.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:30.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:30.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:30.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:30.533: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:30.533: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:30.611: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:30.611: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:30.690: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:30.690: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:30.769: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:30.769: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:30.847: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:30.847: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:30.925: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:30.925: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:31.004: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:31.004: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:31.082: 
INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:31.082: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:31.160: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:31.161: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:31.239: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:31.239: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:31.317: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:31.317: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:31.396: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:31.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:31.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:31.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:31.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:31.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:31.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:31.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:31.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:31.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:31.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:31.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:31.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:31.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:31.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:31.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:32.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:32.293: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:32.293: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:32.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:32.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:32.450: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:32.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:32.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:32.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:32.607: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:32.607: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:32.685: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:32.685: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:32.764: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:32.764: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:32.842: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:32.842: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:32.920: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:32.920: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:32.998: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:32.998: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:33.077: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:33.077: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:33.155: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:33.156: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:33.234: INFO: Poke("http://34.127.8.109:80"): success 
Nov 26 18:49:33.234: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:33.313: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:33.313: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:33.391: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:33.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:33.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:33.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:33.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:33.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:33.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:33.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:33.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:33.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:33.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:33.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:33.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:33.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:33.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:33.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:34.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:34.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:34.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:34.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:34.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:34.452: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:34.452: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:34.530: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:34.530: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:34.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:34.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:34.687: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:34.687: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:34.766: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:34.766: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:34.845: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:34.845: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:34.923: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:34.923: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:35.001: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:35.001: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:35.080: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:35.080: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:35.158: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:35.158: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:35.237: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:35.237: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:35.315: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:35.315: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:35.393: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:35.393: INFO: Received response 
from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:35.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:36.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:36.297: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:36.297: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:36.376: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:36.376: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:36.454: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:36.454: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:36.532: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:36.532: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:36.611: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:36.611: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:36.690: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:36.690: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:36.768: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:36.768: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:36.847: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:36.847: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:36.925: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:36.925: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:37.004: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:37.004: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:37.082: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:37.082: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:37.161: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:37.161: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:37.239: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:37.239: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:37.317: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:37.317: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:37.396: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:37.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:37.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:37.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:37.396: INFO: Received response from host: 
affinity-lb-esipp-transition-m2r6p Nov 26 18:49:37.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:37.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:37.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:37.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:37.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:37.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:37.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:37.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:37.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:37.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:37.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:38.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:38.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:38.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:38.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:38.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:38.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:38.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:38.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:38.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:38.607: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:38.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:38.686: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:38.686: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:38.764: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:38.764: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:38.843: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:38.843: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:38.944: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:38.944: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:39.023: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:39.023: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:39.101: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:39.101: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:39.180: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:39.180: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:39.258: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:39.258: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:39.337: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:39.337: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:39.415: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:39.415: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:39.415: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:39.415: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:39.415: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:39.415: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:39.415: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:39.415: INFO: Received response from host: 
affinity-lb-esipp-transition-m2r6p Nov 26 18:49:39.415: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:39.415: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:39.415: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:39.415: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:39.415: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:39.415: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:39.415: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:39.415: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:40.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:40.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:40.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:40.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:40.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:40.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:40.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:40.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:40.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:40.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:40.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:40.687: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:40.687: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:40.766: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:40.766: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:40.844: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:40.844: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:40.923: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:40.923: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:41.002: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:41.003: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:41.081: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:41.081: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:41.159: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:41.159: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:41.238: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:41.238: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:41.316: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:41.316: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:41.394: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:41.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:41.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:41.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:41.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:41.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:41.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:41.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:41.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:41.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:41.394: INFO: Received response from host: 
affinity-lb-esipp-transition-m2r6p Nov 26 18:49:41.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:41.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:41.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:41.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:41.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:42.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:42.293: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:42.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:42.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:42.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:42.450: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:42.450: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:42.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:42.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:42.607: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:42.607: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:42.686: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:42.686: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:42.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:42.765: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:42.843: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:42.843: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:42.922: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:42.922: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:43.000: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:43.000: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:43.078: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:43.078: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:43.157: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:43.157: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:43.236: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:43.236: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:43.314: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:43.314: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:43.393: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:43.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:43.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:43.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:43.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:43.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:43.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:43.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:43.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:43.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:43.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:43.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:43.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:43.393: INFO: Received response from host: 
affinity-lb-esipp-transition-m2r6p Nov 26 18:49:43.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:43.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:44.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:44.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:44.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:44.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:44.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:44.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:44.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:44.530: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:44.530: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:44.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:44.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:44.687: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:44.687: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:44.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:44.765: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:44.843: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:44.843: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:44.922: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:44.922: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:45.000: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:45.000: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:45.079: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:45.079: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:45.157: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:45.157: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:45.235: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:45.236: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:45.314: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:45.314: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:45.392: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:45.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:45.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:45.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:45.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:45.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:45.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:45.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:45.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:45.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:45.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:45.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:45.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:45.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:45.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:45.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:46.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:46.294: 
INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:46.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:46.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:46.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:46.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:46.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:46.530: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:46.530: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:46.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:46.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:46.686: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:46.687: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:46.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:46.765: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:46.844: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:46.844: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:46.922: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:46.922: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:47.000: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:47.000: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:47.079: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:47.079: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:47.158: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:47.158: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:47.237: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:47.237: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:47.315: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:47.315: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:47.394: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:47.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:47.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:47.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:47.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:47.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:47.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:47.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:47.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:47.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:47.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:47.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:47.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:47.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:47.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:47.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:48.217: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:48.295: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:48.295: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:48.374: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:48.374: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:48.452: INFO: Poke("http://34.127.8.109:80"): success 
Nov 26 18:49:48.452: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:48.530: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:48.530: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:48.609: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:48.609: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:48.687: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:48.687: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:48.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:48.766: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:48.844: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:48.844: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:48.922: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:48.922: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:49.001: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:49.001: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:49.079: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:49.079: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:49.158: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:49.158: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:49.237: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:49.237: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:49.316: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:49.316: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:49.394: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:49.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:49.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:49.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:49.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:49.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:49.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:49.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:49.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:49.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:49.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:49.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:49.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:49.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:49.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:49.394: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:50.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:50.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:50.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:50.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:50.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:50.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:50.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:50.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:50.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:50.607: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:50.607: INFO: Poking 
"http://34.127.8.109:80" Nov 26 18:49:50.686: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:50.686: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:50.764: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:50.764: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:50.843: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:50.843: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:50.921: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:50.921: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:50.999: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:50.999: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:51.078: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:51.078: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:51.156: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:51.156: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:51.235: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:51.235: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:51.313: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:51.313: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:51.391: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:51.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:51.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:51.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:51.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:51.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:51.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:51.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:51.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:51.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:51.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:51.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:51.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:51.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:51.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:51.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:52.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:52.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:52.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:52.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:52.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:52.450: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:52.450: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:52.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:52.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:52.607: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:52.607: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:52.685: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:52.685: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:52.764: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:52.764: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:52.842: 
INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:52.842: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:52.920: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:52.920: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:52.999: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:52.999: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:53.077: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:53.077: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:53.156: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:53.156: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:53.234: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:53.234: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:53.312: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:53.312: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:53.391: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:53.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:53.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:53.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:53.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:53.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:53.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:53.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:53.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:53.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:53.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:53.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:53.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:53.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:53.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:53.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:54.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:54.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:54.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:54.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:54.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:54.450: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:54.450: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:54.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:54.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:54.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:54.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:54.687: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:54.687: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:54.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:54.765: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:54.843: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:54.843: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:54.922: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:54.922: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:55.000: INFO: Poke("http://34.127.8.109:80"): success 
Nov 26 18:49:55.000: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:55.079: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:55.079: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:55.157: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:55.157: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:55.236: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:55.236: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:55.314: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:55.314: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:55.392: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:55.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:55.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:55.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:55.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:55.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:55.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:55.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:55.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:55.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:55.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:55.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:55.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:55.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:55.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:55.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:56.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:56.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:56.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:56.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:56.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:56.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:56.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:56.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:56.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:56.607: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:56.607: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:56.685: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:56.685: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:56.764: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:56.764: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:56.842: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:56.842: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:56.921: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:56.921: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:56.999: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:57.000: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:57.078: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:57.078: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:57.156: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:57.156: INFO: Poking 
"http://34.127.8.109:80" Nov 26 18:49:57.234: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:57.234: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:57.312: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:57.312: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:57.391: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:57.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:57.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:57.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:57.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:57.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:57.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:57.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:57.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:57.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:57.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:57.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:57.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:57.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:57.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:57.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:58.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:58.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:58.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:58.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:58.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:58.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:58.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:58.530: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:58.530: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:58.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:58.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:58.686: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:58.686: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:58.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:58.765: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:58.843: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:58.843: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:58.922: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:58.922: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:59.000: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:59.000: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:59.078: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:59.078: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:59.156: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:59.156: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:59.235: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:59.235: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:59.313: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:59.313: INFO: Poking "http://34.127.8.109:80" Nov 26 18:49:59.391: 
INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:49:59.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:59.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:59.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:59.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:59.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:59.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:59.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:59.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:59.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:59.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:59.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:59.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:59.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:59.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:49:59.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:00.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:00.293: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:00.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:00.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:00.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:00.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:00.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:00.530: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:00.530: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:00.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:00.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:00.686: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:00.686: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:00.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:00.765: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:00.843: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:00.843: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:00.921: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:00.921: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:00.999: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:00.999: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:01.077: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:01.078: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:01.156: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:01.156: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:01.234: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:01.234: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:01.313: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:01.313: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:01.391: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:01.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:01.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:01.391: INFO: Received response from host: 
affinity-lb-esipp-transition-m2r6p Nov 26 18:50:01.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:01.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:01.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:01.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:01.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:01.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:01.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:01.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:01.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:01.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:01.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:01.391: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:02.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:02.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:02.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:02.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:02.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:02.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:02.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:02.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:02.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:02.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:02.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:02.686: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:02.686: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:02.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:02.765: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:02.843: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:02.843: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:02.922: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:02.922: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:03.001: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:03.001: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:03.079: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:03.079: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:03.158: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:03.158: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:03.236: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:03.236: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:03.315: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:03.315: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:03.393: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:03.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:03.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:03.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:03.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:03.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:03.393: INFO: Received response from host: 
affinity-lb-esipp-transition-m2r6p Nov 26 18:50:03.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:03.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:03.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:03.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:03.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:03.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:03.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:03.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:03.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:04.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:04.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:04.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:04.380: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:04.380: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:04.458: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:04.458: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:04.536: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:04.536: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:04.616: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:04.616: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:04.695: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:04.695: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:04.773: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:04.773: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:04.852: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:04.852: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:04.930: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:04.930: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:05.008: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:05.008: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:05.086: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:05.086: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:05.165: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:05.165: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:05.243: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:05.243: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:05.321: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:05.321: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:05.400: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:05.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:05.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:05.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:05.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:05.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:05.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:05.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:05.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:05.400: INFO: Received response from host: 
affinity-lb-esipp-transition-m2r6p Nov 26 18:50:05.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:05.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:05.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:05.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:05.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:05.400: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:06.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:06.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:06.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:06.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:06.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:06.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:06.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:06.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:06.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:06.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:06.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:06.686: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:06.686: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:06.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:06.765: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:06.843: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:06.843: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:06.921: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:06.921: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:07.000: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:07.000: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:07.078: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:07.078: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:07.157: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:07.157: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:07.235: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:07.235: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:07.314: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:07.314: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:07.393: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:07.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:07.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:07.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:07.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:07.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:07.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:07.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:07.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:07.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:07.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:07.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:07.393: INFO: Received response from host: 
affinity-lb-esipp-transition-m2r6p Nov 26 18:50:07.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:07.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:07.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:08.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:08.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:08.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:08.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:08.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:08.450: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:08.450: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:08.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:08.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:08.607: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:08.607: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:08.686: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:08.686: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:08.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:08.765: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:08.843: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:08.843: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:08.921: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:08.921: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:08.999: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:09.000: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:09.078: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:09.078: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:09.156: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:09.156: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:09.235: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:09.235: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:09.313: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:09.313: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:09.392: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:09.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:09.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:09.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:09.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:09.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:09.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:09.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:09.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:09.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:09.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:09.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:09.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:09.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:09.392: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:09.392: INFO: Received response from host: 
affinity-lb-esipp-transition-m2r6p Nov 26 18:50:10.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:10.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:10.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:10.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:10.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:10.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:10.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:10.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:10.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:10.607: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:10.607: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:10.685: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:10.685: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:10.764: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:10.764: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:10.842: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:10.842: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:10.921: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:10.921: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:10.999: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:10.999: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:11.078: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:11.078: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:11.157: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:11.157: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:11.238: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:11.238: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:11.317: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:11.317: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:11.396: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:11.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:11.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:11.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:11.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:11.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:11.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:11.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:11.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:11.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:11.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:11.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:11.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:11.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:11.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:11.396: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:12.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:12.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:12.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:12.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 
18:50:12.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:12.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:12.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:12.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:12.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:12.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:12.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:12.687: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:12.687: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:12.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:12.765: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:12.844: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:12.844: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:12.922: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:12.922: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:13.001: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:13.001: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:13.079: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:13.079: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:13.157: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:13.157: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:13.236: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:13.236: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:13.314: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:13.314: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:13.393: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:13.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:13.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:13.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:13.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:13.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:13.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:13.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:13.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:13.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:13.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:13.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:13.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:13.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:13.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:13.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:14.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:14.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:14.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:14.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:14.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:14.450: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:14.450: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:14.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:14.529: INFO: Poking 
"http://34.127.8.109:80" Nov 26 18:50:14.607: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:14.607: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:14.685: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:14.685: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:14.763: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:14.763: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:14.841: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:14.841: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:14.920: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:14.920: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:14.998: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:14.998: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:15.076: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:15.076: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:15.154: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:15.154: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:15.232: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:15.232: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:15.311: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:15.311: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:15.389: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:15.389: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:15.389: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:15.389: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:15.389: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:15.389: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:15.389: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:15.389: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:15.389: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:15.389: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:15.389: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:15.389: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:15.389: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:15.389: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:15.389: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:15.389: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:16.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:16.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:16.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:16.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:16.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:16.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:16.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:16.530: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:16.530: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:16.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:16.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:16.687: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:16.687: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:16.765: 
INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:16.765: INFO: Poking "http://34.127.8.109:80" ------------------------------ Progress Report for Ginkgo Process #13 Automatically polling progress: [sig-network] LoadBalancers should be able to switch session affinity for LoadBalancer service with ESIPP on [Slow] [LinuxOnly] (Spec Runtime: 5m0.425s) test/e2e/network/loadbalancer.go:780 In [It] (Node Runtime: 5m0.001s) test/e2e/network/loadbalancer.go:780 At [By Step] waiting for loadbalancer for service loadbalancers-5615/affinity-lb-esipp-transition (Step Runtime: 4m53.622s) test/e2e/network/service.go:3980 Spec Goroutine goroutine 1475 [select] net/http.(*persistConn).roundTrip(0xc003b47d40, 0xc003b646c0) /usr/local/go/src/net/http/transport.go:2620 net/http.(*Transport).roundTrip(0xc003d12140, 0xc003b55300) /usr/local/go/src/net/http/transport.go:595 net/http.(*Transport).RoundTrip(0xc003b55300?, 0x7fadc80?) /usr/local/go/src/net/http/roundtrip.go:17 net/http.send(0xc003b55200, {0x7fadc80, 0xc003d12140}, {0x74d54e0?, 0x26b3a01?, 0xae40400?}) /usr/local/go/src/net/http/client.go:251 net/http.(*Client).send(0xc003b5d320, 0xc003b55200, {0x0?, 0x262a61f?, 0xae40400?}) /usr/local/go/src/net/http/client.go:175 net/http.(*Client).do(0xc003b5d320, 0xc003b55200) /usr/local/go/src/net/http/client.go:715 net/http.(*Client).Do(...) /usr/local/go/src/net/http/client.go:581 net/http.(*Client).Get(0x2?, {0xc002a86768?, 0x9?}) /usr/local/go/src/net/http/client.go:479 k8s.io/kubernetes/test/e2e/framework/network.httpGetNoConnectionPoolTimeout({0xc002a86768, 0x16}, 0x77359400) test/e2e/framework/network/utils.go:1065 k8s.io/kubernetes/test/e2e/framework/network.PokeHTTP({0xc0020a2fb0, 0xc}, 0x50, {0x0, 0x0}, 0x0?) test/e2e/framework/network/utils.go:998 > k8s.io/kubernetes/test/e2e/network.affinityCheckFromTest.func1() test/e2e/network/service.go:166 > k8s.io/kubernetes/test/e2e/network.checkAffinity.func1() test/e2e/network/service.go:193 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.ConditionFunc.WithContext.func1({0x2742871, 0x0}) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:222 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.runConditionWithCrashProtectionWithContext({0x7fe0bc8?, 0xc0000820c8?}, 0x262a61f?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:235 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0000820c8}, 0xc000e0ef78, 0x2fdb16a?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:662 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0000820c8}, 0x38?, 0x2fd9d05?, 0x18?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0000820c8}, 0xc0027fbd88?, 0xc0027fbd88?, 0x262a967?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x801de88?, 0xc0029d6b60?, 0xc0020a2fb0?) vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514 > k8s.io/kubernetes/test/e2e/network.checkAffinity({0x801de88?, 0xc0029d6b60?}, 0x0?, {0xc0020a2fb0?, 0x7638a85?}, 0x7f8aa84?, 0x0) test/e2e/network/service.go:192 > k8s.io/kubernetes/test/e2e/network.execAffinityTestForLBServiceWithOptionalTransition(0x7638a85?, {0x801de88, 0xc0029d6b60}, 0xc0004c7180, 0x1) test/e2e/network/service.go:4002 > k8s.io/kubernetes/test/e2e/network.execAffinityTestForLBServiceWithTransition(...) 
test/e2e/network/service.go:3962 > k8s.io/kubernetes/test/e2e/network.glob..func19.9() test/e2e/network/loadbalancer.go:787 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0x2d591ce, 0xc001992480}) vendor/github.com/onsi/ginkgo/v2/internal/node.go:449 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2() vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750 k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738 ------------------------------ Nov 26 18:50:16.843: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:16.843: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:16.922: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:16.922: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:17.000: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:17.001: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:17.079: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:17.079: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:17.158: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:17.158: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:17.236: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:17.236: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:17.314: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:17.314: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:17.393: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:17.393: INFO: Received response from host: affinity-lb-esipp-transition-m2r6p Nov 26 18:50:18.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:18.295: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:18.295: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:18.387: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:18.387: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:18.466: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:18.466: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:18.545: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:18.545: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:18.623: INFO: Poke("http://34.127.8.109:80"): success Nov 26 
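The repeated Poking/Poke("..."): success pairs and the batches of "Received response from host: ..." lines come from the affinity check shown in the stack trace above: checkAffinity polls PokeHTTP via wait.PollImmediate, and PokeHTTP issues keep-alive-free GETs through httpGetNoConnectionPoolTimeout. A minimal sketch of that pattern, assuming an agnhost-style backend that replies with its pod hostname; the helper names (pokeOnce, checkSessionAffinity), the timeout, and the probe counts are placeholders for illustration, not the framework's actual code:

package main

import (
	"fmt"
	"io"
	"net/http"
	"time"
)

// pokeOnce mirrors the "no connection pool" style GET seen in the stack
// trace: keep-alives are disabled so every probe opens a fresh TCP
// connection, which matters when checking client-IP based affinity.
func pokeOnce(url string, timeout time.Duration) (string, error) {
	client := &http.Client{
		Timeout:   timeout,
		Transport: &http.Transport{DisableKeepAlives: true},
	}
	resp, err := client.Get(url)
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	body, err := io.ReadAll(resp.Body)
	if err != nil {
		return "", err
	}
	// The backend answers with the serving pod's hostname, which is what the
	// "Received response from host: ..." lines in the log show.
	return string(body), nil
}

// checkSessionAffinity keeps poking until the last `window` responses all
// name the same backend, or gives up after maxProbes attempts.
func checkSessionAffinity(url string, window, maxProbes int) (bool, string) {
	var hosts []string
	for i := 0; i < maxProbes; i++ {
		host, err := pokeOnce(url, 2*time.Second)
		if err != nil {
			continue // transient errors are simply retried, like the poller above
		}
		hosts = append(hosts, host)
		if len(hosts) >= window && allEqual(hosts[len(hosts)-window:]) {
			return true, host
		}
		time.Sleep(100 * time.Millisecond)
	}
	return false, ""
}

func allEqual(hs []string) bool {
	for _, h := range hs {
		if h != hs[0] {
			return false
		}
	}
	return true
}

func main() {
	// 34.127.8.109:80 is the ephemeral LoadBalancer IP from this run; the
	// window of 15 matches the batches of identical responses in the log.
	ok, host := checkSessionAffinity("http://34.127.8.109:80", 15, 200)
	fmt.Println("affinity held:", ok, "host:", host)
}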
18:50:18.623: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:18.702: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:18.702: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:18.780: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:18.780: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:18.859: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:18.859: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:18.937: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:18.937: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:19.015: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:19.015: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:19.094: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:19.094: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:19.172: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:19.172: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:19.250: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:19.250: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:19.328: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:19.328: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:19.407: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:19.407: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:19.407: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:19.407: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:19.407: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:19.407: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:19.407: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:19.407: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:19.407: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:19.407: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:19.407: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:19.407: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:19.407: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:19.408: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:19.408: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:19.408: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:20.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:20.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:20.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:20.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:20.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:20.450: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:20.450: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:20.530: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:20.530: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:20.609: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:20.609: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:20.687: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:20.687: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:20.766: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:20.766: INFO: Poking 
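Around this point the responding backend changes from affinity-lb-esipp-transition-m2r6p to affinity-lb-esipp-transition-brzbx. The spec under test (execAffinityTestForLBServiceWithTransition in the stack above) toggles the Service's sessionAffinity during the run; a minimal client-go sketch of such a toggle, assuming a configured kubernetes.Interface and with placeholder namespace and service name, not the test's own helper:

package affinitysketch

import (
	"context"

	v1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
)

// setSessionAffinity updates an existing Service to the given affinity mode,
// e.g. v1.ServiceAffinityClientIP or v1.ServiceAffinityNone.
func setSessionAffinity(ctx context.Context, cs kubernetes.Interface, ns, name string, affinity v1.ServiceAffinity) error {
	svc, err := cs.CoreV1().Services(ns).Get(ctx, name, metav1.GetOptions{})
	if err != nil {
		return err
	}
	svc.Spec.SessionAffinity = affinity
	_, err = cs.CoreV1().Services(ns).Update(ctx, svc, metav1.UpdateOptions{})
	return err
}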
"http://34.127.8.109:80" Nov 26 18:50:20.844: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:20.844: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:20.923: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:20.923: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:21.001: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:21.001: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:21.079: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:21.079: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:21.158: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:21.158: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:21.236: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:21.236: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:21.314: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:21.314: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:21.393: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:21.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:21.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:21.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:21.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:21.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:21.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:21.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:21.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:21.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:21.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:21.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:21.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:21.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:21.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:21.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:22.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:22.295: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:22.295: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:22.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:22.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:22.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:22.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:22.530: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:22.530: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:22.609: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:22.609: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:22.687: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:22.687: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:22.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:22.765: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:22.843: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:22.843: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:22.922: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:22.922: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:23.000: 
INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:23.000: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:23.079: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:23.079: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:23.157: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:23.157: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:23.236: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:23.236: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:23.314: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:23.314: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:23.392: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:23.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:23.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:23.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:23.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:23.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:23.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:23.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:23.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:23.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:23.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:23.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:23.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:23.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:23.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:23.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:24.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:24.295: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:24.295: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:24.373: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:24.373: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:24.452: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:24.452: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:24.530: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:24.530: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:24.608: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:24.608: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:24.687: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:24.687: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:24.765: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:24.765: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:24.843: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:24.844: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:24.923: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:24.923: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:25.001: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:25.001: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:25.079: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:25.079: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:25.158: INFO: Poke("http://34.127.8.109:80"): success 
Nov 26 18:50:25.158: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:25.236: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:25.236: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:25.315: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:25.315: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:25.393: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:25.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:25.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:25.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:25.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:25.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:25.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:25.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:25.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:25.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:25.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:25.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:25.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:25.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:25.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:25.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:26.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:26.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:26.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:26.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:26.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:26.451: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:26.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:26.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:26.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:26.607: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:26.607: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:26.686: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:26.686: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:26.764: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:26.764: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:26.842: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:26.842: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:26.921: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:26.921: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:26.999: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:26.999: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:27.078: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:27.078: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:27.156: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:27.156: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:27.235: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:27.235: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:27.313: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:27.313: INFO: Poking 
"http://34.127.8.109:80" Nov 26 18:50:27.391: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:27.391: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:27.391: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:27.391: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:27.391: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:27.391: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:27.391: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:27.391: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:27.391: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:27.391: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:27.391: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:27.391: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:27.391: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:27.391: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:27.391: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:27.391: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:28.215: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:28.294: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:28.294: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:28.372: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:28.372: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:28.450: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:28.451: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:28.529: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:28.529: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:28.607: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:28.607: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:28.686: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:28.686: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:28.764: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:28.764: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:28.842: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:28.842: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:28.921: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:28.921: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:28.999: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:28.999: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:29.078: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:29.078: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:29.156: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:29.156: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:29.235: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:29.235: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:29.313: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:29.313: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:29.392: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:29.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:29.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:29.392: 
INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:29.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:29.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:29.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:29.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:29.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:29.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:29.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:29.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:29.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:29.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:29.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:29.392: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:30.216: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:30.299: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:30.299: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:30.378: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:30.378: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:30.457: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:30.457: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:30.536: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:30.536: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:30.614: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:30.614: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:30.693: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:30.693: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:30.771: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:30.771: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:30.850: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:30.850: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:30.928: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:30.928: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:31.006: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:31.006: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:31.084: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:31.084: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:31.162: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:31.162: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:31.241: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:31.241: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:31.319: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:31.319: INFO: Poking "http://34.127.8.109:80" Nov 26 18:50:31.398: INFO: Poke("http://34.127.8.109:80"): success Nov 26 18:50:31.398: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:31.398: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:31.398: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:31.398: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:31.398: INFO: Received response from host: affinity-lb-esipp-transition-brzbx Nov 26 18:50:31.398: INFO: Received response 
from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:31.398: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:31.398: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:31.398: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:31.398: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:31.398: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:31.398: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:31.398: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:31.398: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:31.398: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:32.215: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:32.294: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:32.294: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:32.372: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:32.372: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:32.454: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:32.454: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:32.533: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:32.533: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:32.612: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:32.612: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:32.690: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:32.690: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:32.768: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:32.769: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:32.847: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:32.847: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:32.925: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:32.925: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:33.004: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:33.004: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:33.082: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:33.082: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:33.184: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:33.184: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:33.262: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:33.263: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:33.341: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:33.341: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:33.419: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:33.419: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:33.419: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:33.419: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:33.419: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:33.419: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:33.419: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:33.419: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:33.419: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:33.419: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:33.419: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:33.419: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:33.419: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:33.419: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:33.419: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:33.419: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:34.216: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:34.294: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:34.294: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:34.372: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:34.372: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:34.451: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:34.451: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:34.529: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:34.529: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:34.608: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:34.608: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:34.686: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:34.686: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:34.765: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:34.765: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:34.844: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:34.844: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:34.922: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:34.922: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:35.001: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:35.001: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:35.079: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:35.079: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:35.157: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:35.157: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:35.236: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:35.236: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:35.314: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:35.314: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:35.393: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:35.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:35.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:35.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:35.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:35.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:35.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:35.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:35.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:35.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:35.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:35.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:35.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:35.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:35.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:35.393: INFO: Received response from host: affinity-lb-esipp-transition-brzbx
Nov 26 18:50:36.216: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:36.294: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:36.294: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:36.372: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:36.372: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:36.451: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:36.451: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:36.529: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:36.529: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:36.607: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:36.607: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:36.686: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:36.686: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:36.765: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:36.765: INFO: Poking "http://34.127.8.109:80"
------------------------------
Progress Report for Ginkgo Process #13
Automatically polling progress:
  [sig-network] LoadBalancers should be able to switch session affinity for LoadBalancer service with ESIPP on [Slow] [LinuxOnly] (Spec Runtime: 5m20.428s)
    test/e2e/network/loadbalancer.go:780
    In [It] (Node Runtime: 5m20.004s)
      test/e2e/network/loadbalancer.go:780
    At [By Step] waiting for loadbalancer for service loadbalancers-5615/affinity-lb-esipp-transition (Step Runtime: 5m13.625s)
      test/e2e/network/service.go:3980

  Spec Goroutine
  goroutine 1475 [select]
    net/http.(*persistConn).roundTrip(0xc004020b40, 0xc003eed980)
      /usr/local/go/src/net/http/transport.go:2620
    net/http.(*Transport).roundTrip(0xc0040243c0, 0xc004026400)
      /usr/local/go/src/net/http/transport.go:595
    net/http.(*Transport).RoundTrip(0xc004026400?, 0x7fadc80?)
      /usr/local/go/src/net/http/roundtrip.go:17
    net/http.send(0xc004026300, {0x7fadc80, 0xc0040243c0}, {0x74d54e0?, 0x26b3a01?, 0xae40400?})
      /usr/local/go/src/net/http/client.go:251
    net/http.(*Client).send(0xc004022a20, 0xc004026300, {0x0?, 0x262a61f?, 0xae40400?})
      /usr/local/go/src/net/http/client.go:175
    net/http.(*Client).do(0xc004022a20, 0xc004026300)
      /usr/local/go/src/net/http/client.go:715
    net/http.(*Client).Do(...)
      /usr/local/go/src/net/http/client.go:581
    net/http.(*Client).Get(0x2?, {0xc00400e468?, 0x9?})
      /usr/local/go/src/net/http/client.go:479
    k8s.io/kubernetes/test/e2e/framework/network.httpGetNoConnectionPoolTimeout({0xc00400e468, 0x16}, 0x77359400)
      test/e2e/framework/network/utils.go:1065
    k8s.io/kubernetes/test/e2e/framework/network.PokeHTTP({0xc0020a2fb0, 0xc}, 0x50, {0x0, 0x0}, 0x0?)
      test/e2e/framework/network/utils.go:998
  > k8s.io/kubernetes/test/e2e/network.affinityCheckFromTest.func1()
      test/e2e/network/service.go:166
  > k8s.io/kubernetes/test/e2e/network.checkAffinity.func1()
      test/e2e/network/service.go:193
    k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.ConditionFunc.WithContext.func1({0x2742871, 0x0})
      vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:222
    k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.runConditionWithCrashProtectionWithContext({0x7fe0bc8?, 0xc0000820c8?}, 0x262a61f?)
      vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:235
    k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.WaitForWithContext({0x7fe0bc8, 0xc0000820c8}, 0xc000e0ef78, 0x2fdb16a?)
      vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:662
    k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.poll({0x7fe0bc8, 0xc0000820c8}, 0x38?, 0x2fd9d05?, 0x18?)
      vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:596
    k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediateWithContext({0x7fe0bc8, 0xc0000820c8}, 0xc0027fbd88?, 0xc0027fbd88?, 0x262a967?)
      vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:528
    k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.PollImmediate(0x801de88?, 0xc0029d6b60?, 0xc0020a2fb0?)
      vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:514
  > k8s.io/kubernetes/test/e2e/network.checkAffinity({0x801de88?, 0xc0029d6b60?}, 0x0?, {0xc0020a2fb0?, 0x7638a85?}, 0x7f8aa84?, 0x0)
      test/e2e/network/service.go:192
  > k8s.io/kubernetes/test/e2e/network.execAffinityTestForLBServiceWithOptionalTransition(0x7638a85?, {0x801de88, 0xc0029d6b60}, 0xc0004c7180, 0x1)
      test/e2e/network/service.go:4002
  > k8s.io/kubernetes/test/e2e/network.execAffinityTestForLBServiceWithTransition(...)
      test/e2e/network/service.go:3962
  > k8s.io/kubernetes/test/e2e/network.glob..func19.9()
      test/e2e/network/loadbalancer.go:787
    k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.extractBodyFunction.func3({0x2d591ce, 0xc001992480})
      vendor/github.com/onsi/ginkgo/v2/internal/node.go:449
    k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode.func2()
      vendor/github.com/onsi/ginkgo/v2/internal/suite.go:750
    k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/v2/internal.(*Suite).runNode
      vendor/github.com/onsi/ginkgo/v2/internal/suite.go:738
------------------------------
Nov 26 18:50:36.843: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:36.843: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:36.922: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:36.922: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:37.001: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:37.001: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:37.080: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:37.080: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:37.158: INFO: Poke("http://34.127.8.109:80"): success
Nov 26 18:50:37.158: INFO: Poking "http://34.127.8.109:80"
Nov 26 18:50:37.237: INFO: Poke("http
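Editor's note: the Spec Goroutine above shows the test blocked inside checkAffinity, which (per the stack) drives PokeHTTP through wait.PollImmediate -- it keeps GETting the LoadBalancer address and logging which backend pod answered, which is what produces the Poking / "Received response from host" lines earlier in this log. The standalone Go sketch below illustrates only that general polling pattern; the names checkSessionAffinity and wantConsecutive are hypothetical and this is not the e2e framework's actual helper.

// Sketch of an affinity poll loop: GET a load balancer address until the
// same backend hostname has answered wantConsecutive requests in a row,
// or give up after timeout. Standard library only; illustrative, not the
// Kubernetes e2e implementation.
package main

import (
	"fmt"
	"io"
	"net/http"
	"strings"
	"time"
)

func checkSessionAffinity(url string, wantConsecutive int, interval, timeout time.Duration) error {
	client := &http.Client{Timeout: 5 * time.Second}
	var lastHost string
	streak := 0
	deadline := time.Now().Add(timeout)

	for time.Now().Before(deadline) {
		resp, err := client.Get(url)
		if err != nil {
			// Analogous to a failed Poke(...) in the log: keep retrying until the deadline.
			fmt.Printf("poke %q failed: %v\n", url, err)
			time.Sleep(interval)
			continue
		}
		body, err := io.ReadAll(resp.Body)
		resp.Body.Close()
		if err != nil {
			time.Sleep(interval)
			continue
		}

		host := strings.TrimSpace(string(body)) // assume the echo backend replies with its pod name
		fmt.Printf("received response from host: %s\n", host)
		if host == lastHost {
			streak++
		} else {
			lastHost, streak = host, 1
		}
		if streak >= wantConsecutive {
			return nil // affinity held for the required run of requests
		}
		time.Sleep(interval)
	}
	return fmt.Errorf("affinity to a single host not observed within %v", timeout)
}

func main() {
	// Placeholder address and thresholds, mirroring the 15-request bursts seen in the log.
	if err := checkSessionAffinity("http://34.127.8.109:80", 15, time.Second, 2*time.Minute); err != nil {
		fmt.Println(err)
	}
}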