Result: FAILURE
Tests: 1 failed / 2610 succeeded
Started: 2020-01-14 03:38
Elapsed: 26m24s
Revision: master
resultstore: https://source.cloud.google.com/results/invocations/1c334113-95bb-4dcb-a52d-31bdf6acbc89/targets/test

Test Failures


k8s.io/kubernetes/test/integration/client TestDynamicClient 6.67s

go test -v k8s.io/kubernetes/test/integration/client -run TestDynamicClient$
=== RUN   TestDynamicClient
I0114 03:55:58.914462  106232 controller.go:180] Shutting down kubernetes service endpoint reconciler
I0114 03:55:58.914492  106232 controller.go:87] Shutting down OpenAPI AggregationController
I0114 03:55:58.914516  106232 controller.go:123] Shutting down OpenAPI controller
I0114 03:55:58.914536  106232 cluster_authentication_trust_controller.go:463] Shutting down cluster_authentication_trust_controller controller
I0114 03:55:58.914550  106232 crdregistration_controller.go:142] Shutting down crd-autoregister controller
I0114 03:55:58.914569  106232 apiservice_controller.go:106] Shutting down APIServiceRegistrationController
I0114 03:55:58.914583  106232 autoregister_controller.go:164] Shutting down autoregister controller
I0114 03:55:58.914598  106232 apiapproval_controller.go:196] Shutting down KubernetesAPIApprovalPolicyConformantConditionController
I0114 03:55:58.914621  106232 naming_controller.go:300] Shutting down NamingConditionController
I0114 03:55:58.914637  106232 crd_finalizer.go:276] Shutting down CRDFinalizer
I0114 03:55:58.914650  106232 establishing_controller.go:85] Shutting down EstablishingController
I0114 03:55:58.914675  106232 customresource_discovery_controller.go:220] Shutting down DiscoveryController
I0114 03:55:58.914687  106232 nonstructuralschema_controller.go:197] Shutting down NonStructuralSchemaConditionController
I0114 03:55:58.914702  106232 available_controller.go:398] Shutting down AvailableConditionController
I0114 03:55:58.914833  106232 dynamic_cafile_content.go:181] Shutting down request-header::/tmp/kubernetes-kube-apiserver684072301/proxy-ca.crt
I0114 03:55:58.914854  106232 dynamic_cafile_content.go:181] Shutting down client-ca-bundle::/tmp/kubernetes-kube-apiserver684072301/client-ca.crt
I0114 03:55:58.914871  106232 dynamic_cafile_content.go:181] Shutting down request-header::/tmp/kubernetes-kube-apiserver684072301/proxy-ca.crt
I0114 03:55:58.915048  106232 secure_serving.go:222] Stopped listening on 127.0.0.1:40059
I0114 03:55:58.915067  106232 tlsconfig.go:256] Shutting down DynamicServingCertificateController
I0114 03:55:58.915085  106232 dynamic_serving_content.go:144] Shutting down serving-cert::/tmp/kubernetes-kube-apiserver684072301/apiserver.crt::/tmp/kubernetes-kube-apiserver684072301/apiserver.key
I0114 03:55:58.915111  106232 dynamic_cafile_content.go:181] Shutting down client-ca-bundle::/tmp/kubernetes-kube-apiserver684072301/client-ca.crt
I0114 03:55:59.793980  106232 serving.go:307] Generated self-signed cert (/tmp/kubernetes-kube-apiserver501014504/apiserver.crt, /tmp/kubernetes-kube-apiserver501014504/apiserver.key)
I0114 03:55:59.794005  106232 server.go:596] external host was not specified, using 127.0.0.1
W0114 03:55:59.794015  106232 authentication.go:439] AnonymousAuth is not allowed with the AlwaysAllow authorizer. Resetting AnonymousAuth to false. You should use a different authorizer
W0114 03:56:00.286533  106232 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 03:56:00.286648  106232 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 03:56:00.286668  106232 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 03:56:00.286893  106232 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 03:56:00.287968  106232 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 03:56:00.288010  106232 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 03:56:00.288041  106232 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 03:56:00.288068  106232 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 03:56:00.288308  106232 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 03:56:00.288486  106232 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 03:56:00.288531  106232 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 03:56:00.288595  106232 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0114 03:56:00.288617  106232 plugins.go:158] Loaded 9 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,MutatingAdmissionWebhook,RuntimeClass.
I0114 03:56:00.288626  106232 plugins.go:161] Loaded 6 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ValidatingAdmissionWebhook,RuntimeClass,ResourceQuota.
I0114 03:56:00.289815  106232 plugins.go:158] Loaded 9 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,MutatingAdmissionWebhook,RuntimeClass.
I0114 03:56:00.289834  106232 plugins.go:161] Loaded 6 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ValidatingAdmissionWebhook,RuntimeClass,ResourceQuota.
I0114 03:56:00.291015  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.291048  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.292107  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.292135  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0114 03:56:00.318692  106232 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0114 03:56:00.319704  106232 master.go:264] Using reconciler: lease
I0114 03:56:00.319952  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.319984  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.322003  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.322035  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.322908  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.322938  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.323953  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.323985  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.326323  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.326355  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.327523  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.327557  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.328829  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.328857  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.329881  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.329908  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.331043  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.331076  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.332262  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.332308  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.333529  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.333560  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.334964  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.334999  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.337134  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.337163  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.338911  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.338941  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.340027  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.340056  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.340928  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.340957  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.341694  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.341721  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.343282  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.343312  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.344172  106232 rest.go:113] the default service ipfamily for this cluster is: IPv4
I0114 03:56:00.453520  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.453563  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.455203  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.455235  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.456356  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.456393  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.457339  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.457373  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.458824  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.458853  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.459809  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.459848  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.460810  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.460840  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.462175  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.462206  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.463258  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.463290  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.464282  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.464308  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.466010  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.466040  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.466944  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.466973  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.468748  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.468780  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.469552  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.469581  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.470550  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.470575  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.471357  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.471382  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.472670  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.472802  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.473637  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.473665  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.474397  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.474425  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.475209  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.475231  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.476511  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.476540  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.478019  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.478044  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.478918  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.478951  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.480650  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.480680  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.481571  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.481596  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.482339  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.482367  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.483154  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.483180  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.484404  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.484433  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.485151  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.485184  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.486432  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.486516  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.488850  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.488883  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.489716  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.489740  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.490432  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.490456  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.492114  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.492142  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.493293  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.493323  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.494268  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.494365  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.495383  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.495513  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.496530  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.496555  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.497523  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.497551  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.498402  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.498423  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.500047  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.500077  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.501455  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.501474  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.502257  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.502288  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.503309  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.503341  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.504502  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.504611  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.505499  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.505599  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.506775  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.506804  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.507579  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.507608  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.508872  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.508980  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.509749  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.509780  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.510489  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.510514  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.511614  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.511791  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.512689  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.512725  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.513711  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.513738  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0114 03:56:00.749017  106232 genericapiserver.go:404] Skipping API discovery.k8s.io/v1alpha1 because it has no resources.
W0114 03:56:00.930721  106232 genericapiserver.go:404] Skipping API apps/v1beta2 because it has no resources.
W0114 03:56:00.930751  106232 genericapiserver.go:404] Skipping API apps/v1beta1 because it has no resources.
I0114 03:56:00.952246  106232 plugins.go:158] Loaded 9 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,MutatingAdmissionWebhook,RuntimeClass.
I0114 03:56:00.952271  106232 plugins.go:161] Loaded 6 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ValidatingAdmissionWebhook,RuntimeClass,ResourceQuota.
W0114 03:56:00.953403  106232 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0114 03:56:00.953586  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.953615  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:56:00.954510  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:00.954536  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0114 03:56:00.958672  106232 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0114 03:56:00.959365  106232 aggregator.go:182] Skipping APIService creation for flowcontrol.apiserver.k8s.io/v1alpha1
E0114 03:56:01.143961  106232 controller.go:183] an error on the server ("") has prevented the request from succeeding (get endpoints kubernetes)
I0114 03:56:01.286120  106232 client.go:361] parsed scheme: "endpoint"
I0114 03:56:01.286211  106232 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0114 03:56:03.415527  106232 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0114 03:56:03.415552  106232 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.Endpoints ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0114 03:56:03.415604  106232 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.ServiceAccount ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0114 03:56:03.415625  106232 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.ValidatingWebhookConfiguration ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0114 03:56:03.415837  106232 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.LimitRange ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0114 03:56:03.415869  106232 reflector.go:340] k8s.io/apiextensions-apiserver/pkg/client/informers/externalversions/factory.go:117: watch of *v1.CustomResourceDefinition ended with: very short watch: k8s.io/apiextensions-apiserver/pkg/client/informers/externalversions/factory.go:117: Unexpected watch close - watch lasted less than a second and no items received
W0114 03:56:03.415876  106232 reflector.go:340] k8s.io/kube-aggregator/pkg/client/informers/externalversions/factory.go:117: watch of *v1.APIService ended with: very short watch: k8s.io/kube-aggregator/pkg/client/informers/externalversions/factory.go:117: Unexpected watch close - watch lasted less than a second and no items received
W0114 03:56:03.415922  106232 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.ResourceQuota ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0114 03:56:03.415939  106232 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.Pod ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0114 03:56:03.415967  106232 reflector.go:340] k8s.io/kubernetes/pkg/master/controller/clusterauthenticationtrust/cluster_authentication_trust_controller.go:444: watch of *v1.ConfigMap ended with: very short watch: k8s.io/kubernetes/pkg/master/controller/clusterauthenticationtrust/cluster_authentication_trust_controller.go:444: Unexpected watch close - watch lasted less than a second and no items received
W0114 03:56:03.415810  106232 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1beta1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
I0114 03:56:04.364628  106232 dynamic_cafile_content.go:166] Starting request-header::/tmp/kubernetes-kube-apiserver501014504/proxy-ca.crt
I0114 03:56:04.364633  106232 dynamic_cafile_content.go:166] Starting client-ca-bundle::/tmp/kubernetes-kube-apiserver501014504/client-ca.crt
I0114 03:56:04.364925  106232 dynamic_serving_content.go:129] Starting serving-cert::/tmp/kubernetes-kube-apiserver501014504/apiserver.crt::/tmp/kubernetes-kube-apiserver501014504/apiserver.key
I0114 03:56:04.365556  106232 secure_serving.go:178] Serving securely on 127.0.0.1:46579
I0114 03:56:04.365636  106232 tlsconfig.go:241] Starting DynamicServingCertificateController
I0114 03:56:04.365757  106232 autoregister_controller.go:140] Starting autoregister controller
I0114 03:56:04.365814  106232 cache.go:32] Waiting for caches to sync for autoregister controller
I0114 03:56:04.366698  106232 crd_finalizer.go:264] Starting CRDFinalizer
I0114 03:56:04.366730  106232 available_controller.go:386] Starting AvailableConditionController
I0114 03:56:04.366736  106232 cache.go:32] Waiting for caches to sync for AvailableConditionController controller
I0114 03:56:04.366761  106232 controller.go:81] Starting OpenAPI AggregationController
I0114 03:56:04.365658  106232 apiservice_controller.go:94] Starting APIServiceRegistrationController
I0114 03:56:04.367263  106232 cache.go:32] Waiting for caches to sync for APIServiceRegistrationController controller
I0114 03:56:04.367311  106232 crdregistration_controller.go:111] Starting crd-autoregister controller
I0114 03:56:04.367319  106232 shared_informer.go:206] Waiting for caches to sync for crd-autoregister
I0114 03:56:04.371143  106232 controller.go:86] Starting OpenAPI controller
I0114 03:56:04.371175  106232 customresource_discovery_controller.go:209] Starting DiscoveryController
I0114 03:56:04.371194  106232 naming_controller.go:289] Starting NamingConditionController
I0114 03:56:04.371213  106232 establishing_controller.go:74] Starting EstablishingController
I0114 03:56:04.371236  106232 nonstructuralschema_controller.go:185] Starting NonStructuralSchemaConditionController
I0114 03:56:04.371251  106232 apiapproval_controller.go:184] Starting KubernetesAPIApprovalPolicyConformantConditionController
E0114 03:56:04.377765  106232 controller.go:151] Unable to remove old endpoints from kubernetes service: StorageError: key not found, Code: 1, Key: /dd0bf1e3-e28e-4475-985b-7145f1106eab/registry/masterleases/127.0.0.1, ResourceVersion: 0, AdditionalErrorMsg: 
W0114 03:56:04.378563  106232 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0114 03:56:04.378730  106232 cluster_authentication_trust_controller.go:440] Starting cluster_authentication_trust_controller controller
I0114 03:56:04.378740  106232 shared_informer.go:206] Waiting for caches to sync for cluster_authentication_trust_controller
I0114 03:56:04.379131  106232 dynamic_cafile_content.go:166] Starting client-ca-bundle::/tmp/kubernetes-kube-apiserver501014504/client-ca.crt
I0114 03:56:04.379181  106232 dynamic_cafile_content.go:166] Starting request-header::/tmp/kubernetes-kube-apiserver501014504/proxy-ca.crt
E0114 03:56:04.394345  106232 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0114 03:56:04.397725  106232 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0114 03:56:04.405212  106232 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
I0114 03:56:04.465991  106232 cache.go:39] Caches are synced for autoregister controller
I0114 03:56:04.466847  106232 cache.go:39] Caches are synced for AvailableConditionController controller
I0114 03:56:04.467431  106232 shared_informer.go:213] Caches are synced for crd-autoregister 
I0114 03:56:04.467565  106232 cache.go:39] Caches are synced for APIServiceRegistrationController controller
I0114 03:56:04.479024  106232 shared_informer.go:213] Caches are synced for cluster_authentication_trust_controller 
I0114 03:56:05.364601  106232 controller.go:107] OpenAPI AggregationController: Processing item 
I0114 03:56:05.364634  106232 controller.go:130] OpenAPI AggregationController: action for item : Nothing (removed from the queue).
I0114 03:56:05.364646  106232 controller.go:130] OpenAPI AggregationController: action for item k8s_internal_local_delegation_chain_0000000000: Nothing (removed from the queue).
I0114 03:56:05.381112  106232 storage_scheduling.go:133] created PriorityClass system-node-critical with value 2000001000
I0114 03:56:05.389924  106232 storage_scheduling.go:133] created PriorityClass system-cluster-critical with value 2000000000
I0114 03:56:05.389945  106232 storage_scheduling.go:142] all system priority classes are created successfully or already exist.
W0114 03:56:05.425475  106232 lease.go:224] Resetting endpoints for master service "kubernetes" to [127.0.0.1]
E0114 03:56:05.427065  106232 controller.go:222] unable to sync kubernetes service: Endpoints "kubernetes" is invalid: subsets[0].addresses[0].ip: Invalid value: "127.0.0.1": may not be in the loopback range (127.0.0.0/8)
W0114 03:56:05.572917  106232 cacher.go:162] Terminating all watchers from cacher *apiextensions.CustomResourceDefinition
W0114 03:56:05.573387  106232 cacher.go:162] Terminating all watchers from cacher *core.LimitRange
W0114 03:56:05.573626  106232 cacher.go:162] Terminating all watchers from cacher *core.ResourceQuota
W0114 03:56:05.573877  106232 cacher.go:162] Terminating all watchers from cacher *core.Secret
W0114 03:56:05.574337  106232 cacher.go:162] Terminating all watchers from cacher *core.ConfigMap
W0114 03:56:05.574524  106232 cacher.go:162] Terminating all watchers from cacher *core.Namespace
W0114 03:56:05.574707  106232 cacher.go:162] Terminating all watchers from cacher *core.Endpoints
W0114 03:56:05.575071  106232 cacher.go:162] Terminating all watchers from cacher *core.Pod
W0114 03:56:05.575214  106232 cacher.go:162] Terminating all watchers from cacher *core.ServiceAccount
W0114 03:56:05.575395  106232 cacher.go:162] Terminating all watchers from cacher *core.Service
W0114 03:56:05.579593  106232 cacher.go:162] Terminating all watchers from cacher *node.RuntimeClass
W0114 03:56:05.581904  106232 cacher.go:162] Terminating all watchers from cacher *scheduling.PriorityClass
W0114 03:56:05.582536  106232 cacher.go:162] Terminating all watchers from cacher *storage.StorageClass
W0114 03:56:05.583958  106232 cacher.go:162] Terminating all watchers from cacher *admissionregistration.ValidatingWebhookConfiguration
W0114 03:56:05.584152  106232 cacher.go:162] Terminating all watchers from cacher *admissionregistration.MutatingWebhookConfiguration
W0114 03:56:05.585014  106232 cacher.go:162] Terminating all watchers from cacher *apiregistration.APIService
--- FAIL: TestDynamicClient (6.67s)
    testserver.go:181: runtime-config=map[api/all:true]
    testserver.go:182: Starting kube-apiserver on port 46579...
    testserver.go:198: Waiting for /healthz to be ok...
    dynamic_client_test.go:88: unexpected pod in list. wanted &v1.Pod{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"testplk58", GenerateName:"test", Namespace:"default", SelfLink:"/api/v1/namespaces/default/pods/testplk58", UID:"f3218cb3-9d90-4553-a426-bc99f2ef175b", ResourceVersion:"8328", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714570965, loc:(*time.Location)(0x7541d00)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"client.test", Operation:"Update", APIVersion:"v1", Time:(*v1.Time)(0xc040341f80), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc040341fc0)}}}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"test", Image:"test-image", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"Always", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc03ea10c08), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc039db20c0), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration{v1.Toleration{Key:"node.kubernetes.io/not-ready", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc03ea10c30)}, v1.Toleration{Key:"node.kubernetes.io/unreachable", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc03ea10c50)}}, HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(0xc03ea10c58), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(0xc03ea10c5c), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}, Status:v1.PodStatus{Phase:"Pending", Conditions:[]v1.PodCondition(nil), Message:"", Reason:"", NominatedNodeName:"", HostIP:"", PodIP:"", PodIPs:[]v1.PodIP(nil), StartTime:(*v1.Time)(nil), InitContainerStatuses:[]v1.ContainerStatus(nil), ContainerStatuses:[]v1.ContainerStatus(nil), QOSClass:"BestEffort", EphemeralContainerStatuses:[]v1.ContainerStatus(nil)}}, got &v1.Pod{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"testplk58", GenerateName:"test", Namespace:"default", SelfLink:"/api/v1/namespaces/default/pods/testplk58", 
UID:"f3218cb3-9d90-4553-a426-bc99f2ef175b", ResourceVersion:"8328", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714570965, loc:(*time.Location)(0x7541d00)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"client.test", Operation:"Update", APIVersion:"v1", Time:(*v1.Time)(0xc0403ed500), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc0403ed4e0)}}}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"test", Image:"test-image", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"Always", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc03eb5c7b8), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc039dd8cc0), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration{v1.Toleration{Key:"node.kubernetes.io/not-ready", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc03eb5c800)}, v1.Toleration{Key:"node.kubernetes.io/unreachable", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc03eb5c820)}}, HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(0xc03eb5c798), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(0xc03eb5c779), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}, Status:v1.PodStatus{Phase:"Pending", Conditions:[]v1.PodCondition(nil), Message:"", Reason:"", NominatedNodeName:"", HostIP:"", PodIP:"", PodIPs:[]v1.PodIP(nil), StartTime:(*v1.Time)(nil), InitContainerStatuses:[]v1.ContainerStatus(nil), ContainerStatuses:[]v1.ContainerStatus(nil), QOSClass:"BestEffort", EphemeralContainerStatuses:[]v1.ContainerStatus(nil)}}

				from junit_20200114-035339.xml
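For context, here is a minimal sketch of the kind of check the failure at dynamic_client_test.go:88 implies: list pods through the dynamic client, convert the unstructured item back to a typed v1.Pod, and compare it against the pod the typed client created using reflect.DeepEqual. The function name and exact flow below are assumptions for illustration, not the test's actual code, and the List call uses the context-taking signature of current client-go (older releases took no context argument).

// Hypothetical sketch, not the test's code: round-trip a pod through the
// dynamic client and compare it with the typed object we expect to see.
package example

import (
	"context"
	"fmt"
	"reflect"

	v1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/runtime"
	"k8s.io/apimachinery/pkg/runtime/schema"
	"k8s.io/client-go/dynamic"
	"k8s.io/client-go/rest"
)

// checkPodRoundTrip lists pods in the expected pod's namespace via the
// dynamic client and verifies the single item converts back to the same
// typed v1.Pod.
func checkPodRoundTrip(cfg *rest.Config, expected *v1.Pod) error {
	dyn, err := dynamic.NewForConfig(cfg)
	if err != nil {
		return err
	}
	gvr := schema.GroupVersionResource{Version: "v1", Resource: "pods"}
	list, err := dyn.Resource(gvr).Namespace(expected.Namespace).List(context.TODO(), metav1.ListOptions{})
	if err != nil {
		return err
	}
	if len(list.Items) != 1 {
		return fmt.Errorf("expected 1 pod in list, got %d", len(list.Items))
	}
	got := &v1.Pod{}
	if err := runtime.DefaultUnstructuredConverter.FromUnstructured(list.Items[0].Object, got); err != nil {
		return err
	}
	if !reflect.DeepEqual(got, expected) {
		// This mirrors the shape of the failure above: %#v prints pointer
		// addresses, so the dump alone does not reveal which nested field
		// actually differs.
		return fmt.Errorf("unexpected pod in list. wanted %#v, got %#v", expected, got)
	}
	return nil
}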

Find TestDynamicClient mentions in log files | View test history on testgrid


Passed Tests: 2610

Skipped Tests: 4

Error lines from build-log.txt

... skipping 55 lines ...
Recording: record_command_canary
Running command: record_command_canary

+++ Running case: test-cmd.record_command_canary 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: record_command_canary
/home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh: line 155: bogus-expected-to-fail: command not found
!!! [0114 03:43:16] Call tree:
!!! [0114 03:43:16]  1: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:47 record_command_canary(...)
!!! [0114 03:43:16]  2: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:112 eVal(...)
!!! [0114 03:43:16]  3: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:131 juLog(...)
!!! [0114 03:43:16]  4: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:159 record_command(...)
!!! [0114 03:43:16]  5: hack/make-rules/test-cmd.sh:35 source(...)
+++ exit code: 1
+++ error: 1
+++ [0114 03:43:16] Running kubeadm tests
+++ [0114 03:43:21] Building go targets for linux/amd64:
    cmd/kubeadm
+++ [0114 03:44:08] Running tests without code coverage
{"Time":"2020-01-14T03:45:38.742862945Z","Action":"output","Package":"k8s.io/kubernetes/cmd/kubeadm/test/cmd","Output":"ok  \tk8s.io/kubernetes/cmd/kubeadm/test/cmd\t48.978s\n"}
✓  cmd/kubeadm/test/cmd (48.978s)
... skipping 302 lines ...
+++ [0114 03:47:29] Building kube-controller-manager
+++ [0114 03:47:34] Building go targets for linux/amd64:
    cmd/kube-controller-manager
+++ [0114 03:48:05] Starting controller-manager
Flag --port has been deprecated, see --secure-port instead.
I0114 03:48:06.492057   54689 serving.go:313] Generated self-signed cert in-memory
W0114 03:48:06.952686   54689 authentication.go:409] failed to read in-cluster kubeconfig for delegated authentication: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W0114 03:48:06.952751   54689 authentication.go:267] No authentication-kubeconfig provided in order to lookup client-ca-file in configmap/extension-apiserver-authentication in kube-system, so client certificate authentication won't work.
W0114 03:48:06.952760   54689 authentication.go:291] No authentication-kubeconfig provided in order to lookup requestheader-client-ca-file in configmap/extension-apiserver-authentication in kube-system, so request-header client certificate authentication won't work.
W0114 03:48:06.952774   54689 authorization.go:177] failed to read in-cluster kubeconfig for delegated authorization: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W0114 03:48:06.952798   54689 authorization.go:146] No authorization-kubeconfig provided, so SubjectAccessReview of authorization tokens won't work.
I0114 03:48:06.952818   54689 controllermanager.go:161] Version: v1.18.0-alpha.1.658+61d36e4a43b831
I0114 03:48:06.953850   54689 secure_serving.go:178] Serving securely on [::]:10257
I0114 03:48:06.953997   54689 tlsconfig.go:241] Starting DynamicServingCertificateController
I0114 03:48:06.954321   54689 deprecated_insecure_serving.go:53] Serving insecurely on [::]:10252
I0114 03:48:06.954425   54689 leaderelection.go:242] attempting to acquire leader lease  kube-system/kube-controller-manager...
... skipping 71 lines ...
I0114 03:48:07.477812   54689 shared_informer.go:206] Waiting for caches to sync for resource quota
I0114 03:48:07.477861   54689 resource_quota_monitor.go:303] QuotaMonitor running
I0114 03:48:07.478094   54689 controllermanager.go:533] Started "ttl"
W0114 03:48:07.478116   54689 controllermanager.go:512] "bootstrapsigner" is disabled
I0114 03:48:07.478136   54689 ttl_controller.go:116] Starting TTL controller
I0114 03:48:07.478164   54689 shared_informer.go:206] Waiting for caches to sync for TTL
E0114 03:48:07.478487   54689 core.go:90] Failed to start service controller: WARNING: no cloud provider provided, services of type LoadBalancer will fail
W0114 03:48:07.478510   54689 controllermanager.go:525] Skipping "service"
I0114 03:48:07.478520   54689 core.go:241] Will not configure cloud provider routes for allocate-node-cidrs: false, configure-cloud-routes: true.
W0114 03:48:07.478528   54689 controllermanager.go:525] Skipping "route"
W0114 03:48:07.478834   54689 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 03:48:07.478920   54689 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 03:48:07.478940   54689 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
... skipping 65 lines ...
I0114 03:48:08.006513   54689 taint_manager.go:162] Sending events to api server.
I0114 03:48:08.006689   54689 node_lifecycle_controller.go:520] Controller will reconcile labels.
I0114 03:48:08.006733   54689 controllermanager.go:533] Started "nodelifecycle"
I0114 03:48:08.006845   54689 node_lifecycle_controller.go:554] Starting node controller
I0114 03:48:08.006868   54689 shared_informer.go:206] Waiting for caches to sync for taint
I0114 03:48:08.007008   54689 node_lifecycle_controller.go:77] Sending events to api server
E0114 03:48:08.007036   54689 core.go:231] failed to start cloud node lifecycle controller: no cloud provider provided
W0114 03:48:08.007046   54689 controllermanager.go:525] Skipping "cloud-node-lifecycle"
I0114 03:48:08.008234   54689 controllermanager.go:533] Started "persistentvolume-binder"
I0114 03:48:08.008333   54689 pv_controller_base.go:294] Starting persistent volume controller
I0114 03:48:08.008376   54689 shared_informer.go:206] Waiting for caches to sync for persistent volume
I0114 03:48:08.008641   54689 controllermanager.go:533] Started "endpoint"
I0114 03:48:08.008793   54689 endpoints_controller.go:181] Starting endpoint controller
... skipping 28 lines ...
  "goVersion": "go1.13.5",
  "compiler": "gc",
  "platform": "linux/amd64"
}
I0114 03:48:08.292109   54689 shared_informer.go:213] Caches are synced for PV protection 
I0114 03:48:08.315338   54689 shared_informer.go:213] Caches are synced for expand 
I0114 03:48:08.380520   54689 shared_informer.go:213] Caches are synced for ClusterRoleAggregator 
E0114 03:48:08.391420   54689 clusterroleaggregation_controller.go:180] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again
E0114 03:48:08.401861   54689 clusterroleaggregation_controller.go:180] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again
+++ [0114 03:48:08] Testing kubectl version: check client only output matches expected output
I0114 03:48:08.423918   54689 shared_informer.go:213] Caches are synced for service account 
I0114 03:48:08.426342   51249 controller.go:606] quota admission added evaluator for: serviceaccounts
W0114 03:48:08.518207   54689 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="127.0.0.1" does not exist
I0114 03:48:08.578346   54689 shared_informer.go:213] Caches are synced for TTL 
Successful: the flag '--client' shows correct client info
Successful: the flag '--client' correctly has no server version info
+++ [0114 03:48:08] Testing kubectl version: verify json output
I0114 03:48:08.678076   54689 shared_informer.go:213] Caches are synced for resource quota 
I0114 03:48:08.680218   54689 shared_informer.go:213] Caches are synced for attach detach 
... skipping 76 lines ...
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_RESTMapper_evaluation_tests
+++ [0114 03:48:12] Creating namespace namespace-1578973692-18112
namespace/namespace-1578973692-18112 created
Context "test" modified.
+++ [0114 03:48:12] Testing RESTMapper
+++ [0114 03:48:13] "kubectl get unknownresourcetype" returns error as expected: error: the server doesn't have a resource type "unknownresourcetype"
+++ exit code: 0
NAME                              SHORTNAMES   APIGROUP                       NAMESPACED   KIND
bindings                                                                      true         Binding
componentstatuses                 cs                                          false        ComponentStatus
configmaps                        cm                                          true         ConfigMap
endpoints                         ep                                          true         Endpoints
... skipping 650 lines ...
has:valid-pod
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
has:valid-pod
core.sh:186: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: resource(s) were provided, but no name, label selector, or --all flag specified
core.sh:190: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:194: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: setting 'all' parameter but found a non empty selector. 
core.sh:198: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
core.sh:206: Successful get pods -l'name in (valid-pod)' {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:211: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-kubectl-describe-pod\" }}found{{end}}{{end}}:: :
... skipping 12 lines ...
poddisruptionbudget.policy/test-pdb-2 created
core.sh:245: Successful get pdb/test-pdb-2 --namespace=test-kubectl-describe-pod {{.spec.minAvailable}}: 50%
poddisruptionbudget.policy/test-pdb-3 created
core.sh:251: Successful get pdb/test-pdb-3 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 2
poddisruptionbudget.policy/test-pdb-4 created
core.sh:255: Successful get pdb/test-pdb-4 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 50%
error: min-available and max-unavailable cannot be both specified
core.sh:261: Successful get pods --namespace=test-kubectl-describe-pod {{range.items}}{{.metadata.name}}:{{end}}: 
pod/env-test-pod created
matched TEST_CMD_1
matched <set to the key 'key-1' in secret 'test-secret'>
matched TEST_CMD_2
matched <set to the key 'key-2' of config map 'test-configmap'>
... skipping 188 lines ...
pod/valid-pod patched
core.sh:470: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: changed-with-yaml:
pod/valid-pod patched
core.sh:475: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:3.1:
pod/valid-pod patched
core.sh:491: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
+++ [0114 03:48:56] "kubectl patch with resourceVersion 530" returns error as expected: Error from server (Conflict): Operation cannot be fulfilled on pods "valid-pod": the object has been modified; please apply your changes to the latest version and try again
pod "valid-pod" deleted
pod/valid-pod replaced
core.sh:515: Successful get pod valid-pod {{(index .spec.containers 0).name}}: replaced-k8s-serve-hostname
Successful
message:error: --grace-period must have --force specified
has:\-\-grace-period must have \-\-force specified
Successful
message:error: --timeout must have --force specified
has:\-\-timeout must have \-\-force specified
node/node-v1-test created
W0114 03:48:57.355049   54689 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="node-v1-test" does not exist
node/node-v1-test replaced
core.sh:552: Successful get node node-v1-test {{.metadata.annotations.a}}: b
node "node-v1-test" deleted
core.sh:559: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
core.sh:562: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/serve_hostname:
Edit cancelled, no changes made.
... skipping 22 lines ...
spec:
  containers:
  - image: k8s.gcr.io/pause:2.0
    name: kubernetes-pause
has:localonlyvalue
core.sh:585: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
error: 'name' already has a value (valid-pod), and --overwrite is false
core.sh:589: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
core.sh:593: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
pod/valid-pod labeled
core.sh:597: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod-super-sayan
core.sh:601: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
... skipping 85 lines ...
+++ Running case: test-cmd.run_kubectl_create_error_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_create_error_tests
+++ [0114 03:49:09] Creating namespace namespace-1578973749-21565
namespace/namespace-1578973749-21565 created
Context "test" modified.
+++ [0114 03:49:09] Testing kubectl create with error
Error: must specify one of -f and -k

Create a resource from a file or from stdin.

 JSON and YAML formats are accepted.

Examples:
... skipping 41 lines ...

Usage:
  kubectl create -f FILENAME [options]

Use "kubectl <command> --help" for more information about a given command.
Use "kubectl options" for a list of global command-line options (applies to all commands).
+++ [0114 03:49:09] "kubectl create with empty string list returns error as expected: error: error validating "hack/testdata/invalid-rc-with-empty-args.yaml": error validating data: ValidationError(ReplicationController.spec.template.spec.containers[0].args): unknown object type "nil" in ReplicationController.spec.template.spec.containers[0].args[0]; if you choose to ignore these errors, turn validation off with --validate=false
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
+++ exit code: 0
Recording: run_kubectl_apply_tests
Running command: run_kubectl_apply_tests

... skipping 17 lines ...
(Bpod "test-pod" deleted
customresourcedefinition.apiextensions.k8s.io/resources.mygroup.example.com created
I0114 03:49:12.560194   51249 client.go:361] parsed scheme: "endpoint"
I0114 03:49:12.560237   51249 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 03:49:12.564270   51249 controller.go:606] quota admission added evaluator for: resources.mygroup.example.com
kind.mygroup.example.com/myobj serverside-applied (server dry run)
Error from server (NotFound): resources.mygroup.example.com "myobj" not found
customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
+++ exit code: 0
Recording: run_kubectl_run_tests
Running command: run_kubectl_run_tests

+++ Running case: test-cmd.run_kubectl_run_tests 
... skipping 102 lines ...
Context "test" modified.
+++ [0114 03:49:15] Testing kubectl create filter
create.sh:30: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/selector-test-pod created
create.sh:34: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
Successful
message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
has:pods "selector-test-pod-dont-apply" not found
pod "selector-test-pod" deleted
+++ exit code: 0
Recording: run_kubectl_apply_deployments_tests
Running command: run_kubectl_apply_deployments_tests

... skipping 29 lines ...
I0114 03:49:18.725476   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973756-12736", Name:"nginx", UID:"23c6705e-ddf2-46c2-89b1-b7b4999bc0ff", APIVersion:"apps/v1", ResourceVersion:"624", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-8484dd655 to 3
I0114 03:49:18.730877   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973756-12736", Name:"nginx-8484dd655", UID:"50fa130a-1d26-4c24-9bb4-1ebbb6a3fe30", APIVersion:"apps/v1", ResourceVersion:"625", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-q8d9d
I0114 03:49:18.735558   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973756-12736", Name:"nginx-8484dd655", UID:"50fa130a-1d26-4c24-9bb4-1ebbb6a3fe30", APIVersion:"apps/v1", ResourceVersion:"625", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-dvtrw
I0114 03:49:18.736016   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973756-12736", Name:"nginx-8484dd655", UID:"50fa130a-1d26-4c24-9bb4-1ebbb6a3fe30", APIVersion:"apps/v1", ResourceVersion:"625", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-fqrfv
apps.sh:148: Successful get deployment nginx {{.metadata.name}}: nginx
(BSuccessful
message:Error from server (Conflict): error when applying patch:
{"metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1578973756-12736\",\"resourceVersion\":\"99\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx2\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx2\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"},"resourceVersion":"99"},"spec":{"selector":{"matchLabels":{"name":"nginx2"}},"template":{"metadata":{"labels":{"name":"nginx2"}}}}}
to:
Resource: "apps/v1, Resource=deployments", GroupVersionKind: "apps/v1, Kind=Deployment"
Name: "nginx", Namespace: "namespace-1578973756-12736"
for: "hack/testdata/deployment-label-change2.yaml": Operation cannot be fulfilled on deployments.apps "nginx": the object has been modified; please apply your changes to the latest version and try again
has:Error from server (Conflict)
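The Conflict above is the expected outcome: the manifest being applied pins resourceVersion "99" in its last-applied configuration, that version is stale by the time the patch reaches the server, and the optimistic-concurrency check rejects it. A sketch of the failing call, using only the path from the log; re-applying against the current object is what lets the subsequent "deployment.apps/nginx configured" succeed:

  kubectl apply -f hack/testdata/deployment-label-change2.yaml   # Conflict: the object has been modified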
I0114 03:49:23.309837   54689 horizontal.go:353] Horizontal Pod Autoscaler frontend has been deleted in namespace-1578973746-3020
deployment.apps/nginx configured
I0114 03:49:28.298253   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973756-12736", Name:"nginx", UID:"d3882491-8f89-4575-8565-5859608cca21", APIVersion:"apps/v1", ResourceVersion:"666", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-668b6c7744 to 3
I0114 03:49:28.303203   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973756-12736", Name:"nginx-668b6c7744", UID:"a11f39df-dbc7-4cb2-b2ba-dbf9b84d6fa9", APIVersion:"apps/v1", ResourceVersion:"667", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-zvwpp
I0114 03:49:28.308321   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973756-12736", Name:"nginx-668b6c7744", UID:"a11f39df-dbc7-4cb2-b2ba-dbf9b84d6fa9", APIVersion:"apps/v1", ResourceVersion:"667", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-qbjf8
I0114 03:49:28.308455   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973756-12736", Name:"nginx-668b6c7744", UID:"a11f39df-dbc7-4cb2-b2ba-dbf9b84d6fa9", APIVersion:"apps/v1", ResourceVersion:"667", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-fzcdp
... skipping 142 lines ...
+++ [0114 03:49:36] Creating namespace namespace-1578973776-4580
namespace/namespace-1578973776-4580 created
Context "test" modified.
+++ [0114 03:49:36] Testing kubectl get
get.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
(BSuccessful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:37: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
(BSuccessful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:45: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
(BSuccessful
message:{
    "apiVersion": "v1",
    "items": [],
... skipping 23 lines ...
has not:No resources found
Successful
message:NAME
has not:No resources found
get.sh:73: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
(BSuccessful
message:error: the server doesn't have a resource type "foobar"
has not:No resources found
Successful
message:No resources found in namespace-1578973776-4580 namespace.
has:No resources found
Successful
message:
has not:No resources found
Successful
message:No resources found in namespace-1578973776-4580 namespace.
has:No resources found
get.sh:93: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
(BSuccessful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
Successful
message:Error from server (NotFound): pods "abc" not found
has not:List
Successful
message:I0114 03:49:38.405640   65174 loader.go:375] Config loaded from file:  /tmp/tmp.VHuTjdTtMr/.kube/config
I0114 03:49:38.407050   65174 round_trippers.go:443] GET http://127.0.0.1:8080/version?timeout=32s 200 OK in 0 milliseconds
I0114 03:49:38.435441   65174 round_trippers.go:443] GET http://127.0.0.1:8080/api/v1/namespaces/default/pods 200 OK in 1 milliseconds
I0114 03:49:38.437403   65174 round_trippers.go:443] GET http://127.0.0.1:8080/api/v1/namespaces/default/replicationcontrollers 200 OK in 1 milliseconds
... skipping 479 lines ...
Successful
message:NAME    DATA   AGE
one     0      1s
three   0      1s
two     0      1s
STATUS    REASON          MESSAGE
Failure   InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has not:watch is only supported on individual resources
Successful
message:STATUS    REASON          MESSAGE
Failure   InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has not:watch is only supported on individual resources
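The two InternalError rows above come from running a watch with a deliberately short client-side request timeout: the watch stream is cut off mid-read, which surfaces as the decode error instead of the old "watch is only supported on individual resources" message the assertions guard against. A sketch of that kind of call, with an illustrative timeout value:

  kubectl get configmaps --watch --request-timeout=1s   # watch stream canceled once the client timeout fires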
+++ [0114 03:49:45] Creating namespace namespace-1578973785-7500
namespace/namespace-1578973785-7500 created
Context "test" modified.
get.sh:153: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
(Bpod/valid-pod created
... skipping 105 lines ...
}
get.sh:158: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
(B<no value>Successful
message:valid-pod:
has:valid-pod:
Successful
message:error: error executing jsonpath "{.missing}": Error executing template: missing is not found. Printing more information for debugging the template:
	template was:
		{.missing}
	object given to jsonpath engine was:
		map[string]interface {}{"apiVersion":"v1", "kind":"Pod", "metadata":map[string]interface {}{"creationTimestamp":"2020-01-14T03:49:45Z", "labels":map[string]interface {}{"name":"valid-pod"}, "managedFields":[]interface {}{map[string]interface {}{"apiVersion":"v1", "fieldsType":"FieldsV1", "fieldsV1":map[string]interface {}{"f:metadata":map[string]interface {}{"f:labels":map[string]interface {}{".":map[string]interface {}{}, "f:name":map[string]interface {}{}}}, "f:spec":map[string]interface {}{"f:containers":map[string]interface {}{"k:{\"name\":\"kubernetes-serve-hostname\"}":map[string]interface {}{".":map[string]interface {}{}, "f:image":map[string]interface {}{}, "f:imagePullPolicy":map[string]interface {}{}, "f:name":map[string]interface {}{}, "f:resources":map[string]interface {}{".":map[string]interface {}{}, "f:limits":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}, "f:requests":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}}, "f:terminationMessagePath":map[string]interface {}{}, "f:terminationMessagePolicy":map[string]interface {}{}}}, "f:dnsPolicy":map[string]interface {}{}, "f:enableServiceLinks":map[string]interface {}{}, "f:priority":map[string]interface {}{}, "f:restartPolicy":map[string]interface {}{}, "f:schedulerName":map[string]interface {}{}, "f:securityContext":map[string]interface {}{}, "f:terminationGracePeriodSeconds":map[string]interface {}{}}}, "manager":"kubectl", "operation":"Update", "time":"2020-01-14T03:49:45Z"}}, "name":"valid-pod", "namespace":"namespace-1578973785-7500", "resourceVersion":"753", "selfLink":"/api/v1/namespaces/namespace-1578973785-7500/pods/valid-pod", "uid":"a3f2316e-6dfd-41de-ae87-1b9851375377"}, "spec":map[string]interface {}{"containers":[]interface {}{map[string]interface {}{"image":"k8s.gcr.io/serve_hostname", "imagePullPolicy":"Always", "name":"kubernetes-serve-hostname", "resources":map[string]interface {}{"limits":map[string]interface {}{"cpu":"1", "memory":"512Mi"}, "requests":map[string]interface {}{"cpu":"1", "memory":"512Mi"}}, "terminationMessagePath":"/dev/termination-log", "terminationMessagePolicy":"File"}}, "dnsPolicy":"ClusterFirst", "enableServiceLinks":true, "priority":0, "restartPolicy":"Always", "schedulerName":"default-scheduler", "securityContext":map[string]interface {}{}, "terminationGracePeriodSeconds":30}, "status":map[string]interface {}{"phase":"Pending", "qosClass":"Guaranteed"}}
has:missing is not found
error: error executing template "{{.missing}}": template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing"
Successful
message:Error executing template: template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing". Printing more information for debugging the template:
	template was:
		{{.missing}}
	raw data was:
		{"apiVersion":"v1","kind":"Pod","metadata":{"creationTimestamp":"2020-01-14T03:49:45Z","labels":{"name":"valid-pod"},"managedFields":[{"apiVersion":"v1","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:labels":{".":{},"f:name":{}}},"f:spec":{"f:containers":{"k:{\"name\":\"kubernetes-serve-hostname\"}":{".":{},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:resources":{".":{},"f:limits":{".":{},"f:cpu":{},"f:memory":{}},"f:requests":{".":{},"f:cpu":{},"f:memory":{}}},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:enableServiceLinks":{},"f:priority":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}},"manager":"kubectl","operation":"Update","time":"2020-01-14T03:49:45Z"}],"name":"valid-pod","namespace":"namespace-1578973785-7500","resourceVersion":"753","selfLink":"/api/v1/namespaces/namespace-1578973785-7500/pods/valid-pod","uid":"a3f2316e-6dfd-41de-ae87-1b9851375377"},"spec":{"containers":[{"image":"k8s.gcr.io/serve_hostname","imagePullPolicy":"Always","name":"kubernetes-serve-hostname","resources":{"limits":{"cpu":"1","memory":"512Mi"},"requests":{"cpu":"1","memory":"512Mi"}},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File"}],"dnsPolicy":"ClusterFirst","enableServiceLinks":true,"priority":0,"restartPolicy":"Always","schedulerName":"default-scheduler","securityContext":{},"terminationGracePeriodSeconds":30},"status":{"phase":"Pending","qosClass":"Guaranteed"}}
	object given to template engine was:
		map[apiVersion:v1 kind:Pod metadata:map[creationTimestamp:2020-01-14T03:49:45Z labels:map[name:valid-pod] managedFields:[map[apiVersion:v1 fieldsType:FieldsV1 fieldsV1:map[f:metadata:map[f:labels:map[.:map[] f:name:map[]]] f:spec:map[f:containers:map[k:{"name":"kubernetes-serve-hostname"}:map[.:map[] f:image:map[] f:imagePullPolicy:map[] f:name:map[] f:resources:map[.:map[] f:limits:map[.:map[] f:cpu:map[] f:memory:map[]] f:requests:map[.:map[] f:cpu:map[] f:memory:map[]]] f:terminationMessagePath:map[] f:terminationMessagePolicy:map[]]] f:dnsPolicy:map[] f:enableServiceLinks:map[] f:priority:map[] f:restartPolicy:map[] f:schedulerName:map[] f:securityContext:map[] f:terminationGracePeriodSeconds:map[]]] manager:kubectl operation:Update time:2020-01-14T03:49:45Z]] name:valid-pod namespace:namespace-1578973785-7500 resourceVersion:753 selfLink:/api/v1/namespaces/namespace-1578973785-7500/pods/valid-pod uid:a3f2316e-6dfd-41de-ae87-1b9851375377] spec:map[containers:[map[image:k8s.gcr.io/serve_hostname imagePullPolicy:Always name:kubernetes-serve-hostname resources:map[limits:map[cpu:1 memory:512Mi] requests:map[cpu:1 memory:512Mi]] terminationMessagePath:/dev/termination-log terminationMessagePolicy:File]] dnsPolicy:ClusterFirst enableServiceLinks:true priority:0 restartPolicy:Always schedulerName:default-scheduler securityContext:map[] terminationGracePeriodSeconds:30] status:map[phase:Pending qosClass:Guaranteed]]
has:map has no entry for key "missing"
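Both template failures above are driven by asking the printers for a key that does not exist on the pod; jsonpath and go-template just word the error differently. A minimal reproduction against any pod named valid-pod:

  kubectl get pod valid-pod -o jsonpath='{.missing}'        # error executing jsonpath: missing is not found
  kubectl get pod valid-pod -o go-template='{{.missing}}'   # map has no entry for key "missing"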
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
STATUS      REASON          MESSAGE
Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has:STATUS
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
STATUS      REASON          MESSAGE
Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has:valid-pod
Successful
message:pod/valid-pod
status/<unknown>
has not:STATUS
Successful
... skipping 82 lines ...
      (Client.Timeout exceeded while reading body)'
    reason: UnexpectedServerResponse
  - message: 'unable to decode an event from the watch stream: net/http: request canceled
      (Client.Timeout exceeded while reading body)'
    reason: ClientWatchDecoding
kind: Status
message: 'an error on the server ("unable to decode an event from the watch stream:
  net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented
  the request from succeeding'
metadata: {}
reason: InternalError
status: Failure
has not:STATUS
... skipping 79 lines ...
      (Client.Timeout exceeded while reading body)'
    reason: UnexpectedServerResponse
  - message: 'unable to decode an event from the watch stream: net/http: request canceled
      (Client.Timeout exceeded while reading body)'
    reason: ClientWatchDecoding
kind: Status
message: 'an error on the server ("unable to decode an event from the watch stream:
  net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented
  the request from succeeding'
metadata: {}
reason: InternalError
status: Failure
has:name: valid-pod
Successful
message:Error from server (NotFound): pods "invalid-pod" not found
has:"invalid-pod" not found
pod "valid-pod" deleted
get.sh:196: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
(Bpod/redis-master created
pod/valid-pod created
Successful
... skipping 35 lines ...
+++ command: run_kubectl_exec_pod_tests
+++ [0114 03:49:51] Creating namespace namespace-1578973791-1582
namespace/namespace-1578973791-1582 created
Context "test" modified.
+++ [0114 03:49:51] Testing kubectl exec POD COMMAND
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
pod/test-pod created
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pods "test-pod" not found
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod or type/name must be specified
pod "test-pod" deleted
+++ exit code: 0
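kubectl exec needs an existing pod that has actually been scheduled onto a node; in this API-server-only harness pods are never bound, so exec on a real pod fails with "does not have a host assigned" rather than NotFound. A sketch of the two probes, with an arbitrary command after the -- separator:

  kubectl exec abc -- date        # Error from server (NotFound): pods "abc" not found
  kubectl exec test-pod -- date   # Error from server (BadRequest): pod test-pod does not have a host assigned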
Recording: run_kubectl_exec_resource_name_tests
Running command: run_kubectl_exec_resource_name_tests

... skipping 2 lines ...
+++ command: run_kubectl_exec_resource_name_tests
+++ [0114 03:49:52] Creating namespace namespace-1578973792-9156
namespace/namespace-1578973792-9156 created
Context "test" modified.
+++ [0114 03:49:52] Testing kubectl exec TYPE/NAME COMMAND
Successful
message:error: the server doesn't have a resource type "foo"
has:error:
Successful
message:Error from server (NotFound): deployments.apps "bar" not found
has:"bar" not found
pod/test-pod created
replicaset.apps/frontend created
I0114 03:49:53.490964   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973792-9156", Name:"frontend", UID:"9327132c-c999-4ea7-a554-0f21c9ba8cd3", APIVersion:"apps/v1", ResourceVersion:"809", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-6bjss
I0114 03:49:53.494969   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973792-9156", Name:"frontend", UID:"9327132c-c999-4ea7-a554-0f21c9ba8cd3", APIVersion:"apps/v1", ResourceVersion:"809", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-8jfhg
I0114 03:49:53.495255   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973792-9156", Name:"frontend", UID:"9327132c-c999-4ea7-a554-0f21c9ba8cd3", APIVersion:"apps/v1", ResourceVersion:"809", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-x5mff
configmap/test-set-env-config created
Successful
message:error: cannot attach to *v1.ConfigMap: selector for *v1.ConfigMap not implemented
has:not implemented
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:not found
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod or type/name must be specified
Successful
message:Error from server (BadRequest): pod frontend-6bjss does not have a host assigned
has not:not found
Successful
message:Error from server (BadRequest): pod frontend-6bjss does not have a host assigned
has not:pod or type/name must be specified
pod "test-pod" deleted
replicaset.apps "frontend" deleted
configmap "test-set-env-config" deleted
+++ exit code: 0
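The TYPE/NAME variant resolves the named resource to one of its pods before exec'ing, which is why a ReplicaSet gets as far as the usual "no host assigned" error here while a ConfigMap is rejected outright. A sketch of the shape of those calls (the trailing command is arbitrary, and the exact spellings the harness uses may differ):

  kubectl exec foo/bar -- date                          # error: the server doesn't have a resource type "foo"
  kubectl exec deployments/bar -- date                  # Error from server (NotFound): deployments.apps "bar" not found
  kubectl exec replicasets/frontend -- date             # resolves to a pod, then fails with "no host assigned" in this harness
  kubectl exec configmaps/test-set-env-config -- date   # error: cannot attach to *v1.ConfigMap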
Recording: run_create_secret_tests
Running command: run_create_secret_tests

+++ Running case: test-cmd.run_create_secret_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_create_secret_tests
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
message:user-specified
has:user-specified
Successful
{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-update-cm","uid":"221a0585-63ce-42d8-a012-a1b35edb6b9a","resourceVersion":"831","creationTimestamp":"2020-01-14T03:49:54Z"}}
... skipping 2 lines ...
has:uid
Successful
message:{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-update-cm","uid":"221a0585-63ce-42d8-a012-a1b35edb6b9a","resourceVersion":"832","creationTimestamp":"2020-01-14T03:49:54Z"},"data":{"key1":"config1"}}
has:config1
{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Success","details":{"name":"tester-update-cm","kind":"configmaps","uid":"221a0585-63ce-42d8-a012-a1b35edb6b9a"}}
Successful
message:Error from server (NotFound): configmaps "tester-update-cm" not found
has:configmaps "tester-update-cm" not found
+++ exit code: 0
Recording: run_kubectl_create_kustomization_directory_tests
Running command: run_kubectl_create_kustomization_directory_tests

+++ Running case: test-cmd.run_kubectl_create_kustomization_directory_tests 
... skipping 159 lines ...
valid-pod   0/1     Pending   0          0s
has:valid-pod
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          0s
STATUS      REASON          MESSAGE
Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has:Timeout exceeded while reading body
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
has:valid-pod
Successful
message:error: Invalid timeout value. Timeout must be a single integer in seconds, or an integer followed by a corresponding time unit (e.g. 1s | 2m | 3h)
has:Invalid timeout value
pod "valid-pod" deleted
+++ exit code: 0
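The last failure in that block is pure flag validation: --request-timeout only accepts a bare integer number of seconds or an integer with a time unit. A sketch with an illustrative bad value:

  kubectl get pod valid-pod --request-timeout=invalid   # error: Invalid timeout value
  kubectl get pod valid-pod --request-timeout=1s        # accepted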
Recording: run_crd_tests
Running command: run_crd_tests

... skipping 240 lines ...
foo.company.com/test patched
crd.sh:236: Successful get foos/test {{.patched}}: value1
(Bfoo.company.com/test patched
crd.sh:238: Successful get foos/test {{.patched}}: value2
(Bfoo.company.com/test patched
crd.sh:240: Successful get foos/test {{.patched}}: <no value>
(B+++ [0114 03:50:06] "kubectl patch --local" returns error as expected for CustomResource: error: cannot apply strategic merge patch for company.com/v1, Kind=Foo locally, try --type merge
{
    "apiVersion": "company.com/v1",
    "kind": "Foo",
    "metadata": {
        "annotations": {
            "kubernetes.io/change-cause": "kubectl patch foos/test --server=http://127.0.0.1:8080 --match-server-version=true --patch={\"patched\":null} --type=merge --record=true"
... skipping 300 lines ...
(Bcrd.sh:450: Successful get bars {{range.items}}{{.metadata.name}}:{{end}}: 
(Bnamespace/non-native-resources created
bar.company.com/test created
crd.sh:455: Successful get bars {{len .items}}: 1
(Bnamespace "non-native-resources" deleted
crd.sh:458: Successful get bars {{len .items}}: 0
(BError from server (NotFound): namespaces "non-native-resources" not found
customresourcedefinition.apiextensions.k8s.io "foos.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "bars.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
customresourcedefinition.apiextensions.k8s.io "validfoos.company.com" deleted
+++ exit code: 0
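Worth noting from the tail of the CRD block: deleting the non-native-resources namespace also removes the namespaced custom resource inside it, which is what the len-0 check and the NotFound on the namespace confirm. A sketch of that sequence, with an illustrative manifest name for the bar object:

  kubectl create namespace non-native-resources
  kubectl create -f bar.yaml --namespace=non-native-resources   # bar.company.com/test created
  kubectl delete namespace non-native-resources                 # cascades to the bars inside it
  kubectl get bars                                              # zero items once the namespace is gone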
Recording: run_cmd_with_img_tests
... skipping 11 lines ...
I0114 03:50:24.530505   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973824-3659", Name:"test1-6cdffdb5b8", UID:"048d705d-b052-499a-9163-7512c133d301", APIVersion:"apps/v1", ResourceVersion:"997", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test1-6cdffdb5b8-cg8zx
Successful
message:deployment.apps/test1 created
has:deployment.apps/test1 created
deployment.apps "test1" deleted
W0114 03:50:24.733005   51249 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E0114 03:50:24.734327   54689 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: Invalid image name "InvalidImageName": invalid reference format
has:error: Invalid image name "InvalidImageName": invalid reference format
+++ exit code: 0
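run_cmd_with_img_tests creates a deployment from a valid image reference and then asserts that a syntactically invalid one is rejected client-side before anything reaches the server. One plausible way to hit the same code paths with plain kubectl (the harness's exact command and generator may differ, and the image value and second name here are illustrative):

  kubectl create deployment test1 --image=k8s.gcr.io/nginx:test-cmd   # one way to get "deployment.apps/test1 created"
  kubectl run test2 --image=InvalidImageName                          # error: Invalid image name "InvalidImageName": invalid reference format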
+++ [0114 03:50:24] Testing recursive resources
+++ [0114 03:50:24] Creating namespace namespace-1578973824-8778
W0114 03:50:24.844689   51249 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E0114 03:50:24.847406   54689 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578973824-8778 created
W0114 03:50:24.965152   51249 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E0114 03:50:24.966378   54689 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
W0114 03:50:25.085028   51249 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E0114 03:50:25.086168   54689 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
(Bgeneric-resources.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
(BSuccessful
message:pod/busybox0 created
pod/busybox1 created
error: error validating "hack/testdata/recursive/pod/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
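Everything under "Testing recursive resources" feeds kubectl a directory (hack/testdata/recursive/pod and friends) in which one manifest deliberately misspells "kind" as "ind"; each assertion checks that the two well-formed objects are still processed while the broken file surfaces either a validation error or the "Object 'Kind' is missing" decode error. A sketch of the basic pattern:

  kubectl create -f hack/testdata/recursive/pod --recursive                     # busybox0 and busybox1 created, busybox-broken.yaml rejected
  kubectl get pods -o go-template='{{range.items}}{{.metadata.name}}:{{end}}'   # busybox0:busybox1: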
generic-resources.sh:211: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
(BE0114 03:50:25.735836   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:25.848441   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:220: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: busybox:busybox:
(BSuccessful
message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E0114 03:50:25.967990   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:227: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
(BE0114 03:50:26.092403   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:231: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
(BSuccessful
message:pod/busybox0 replaced
pod/busybox1 replaced
error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
generic-resources.sh:236: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
(BSuccessful
message:Name:         busybox0
Namespace:    namespace-1578973824-8778
Priority:     0
Node:         <none>
... skipping 155 lines ...
Node-Selectors:   <none>
Tolerations:      <none>
Events:           <none>
unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:246: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
(BE0114 03:50:26.736902   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:26.849572   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:250: Successful get pods {{range.items}}{{.metadata.annotations.annotatekey}}:{{end}}: annotatevalue:annotatevalue:
(BSuccessful
message:pod/busybox0 annotated
pod/busybox1 annotated
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E0114 03:50:26.969113   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:255: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
(BE0114 03:50:27.093381   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:259: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
(BSuccessful
message:error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
generic-resources.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
(Bdeployment.apps/nginx created
I0114 03:50:27.613358   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973824-8778", Name:"nginx", UID:"24885e34-c026-46dd-bce7-ad103fe9eb34", APIVersion:"apps/v1", ResourceVersion:"1020", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-f87d999f7 to 3
I0114 03:50:27.619089   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973824-8778", Name:"nginx-f87d999f7", UID:"23be87f3-5e34-41e6-8386-6f3f854649c8", APIVersion:"apps/v1", ResourceVersion:"1021", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-w4jxx
I0114 03:50:27.621167   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973824-8778", Name:"nginx-f87d999f7", UID:"23be87f3-5e34-41e6-8386-6f3f854649c8", APIVersion:"apps/v1", ResourceVersion:"1021", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-5bdrb
I0114 03:50:27.622133   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973824-8778", Name:"nginx-f87d999f7", UID:"23be87f3-5e34-41e6-8386-6f3f854649c8", APIVersion:"apps/v1", ResourceVersion:"1021", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-n4gcs
generic-resources.sh:269: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx:
(BE0114 03:50:27.737933   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:270: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
(BE0114 03:50:27.850593   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
E0114 03:50:27.970261   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:274: Successful get deployment nginx {{ .apiVersion }}: apps/v1
(BSuccessful
message:apiVersion: extensions/v1beta1
kind: Deployment
metadata:
  creationTimestamp: null
... skipping 32 lines ...
      restartPolicy: Always
      schedulerName: default-scheduler
      securityContext: {}
      terminationGracePeriodSeconds: 30
status: {}
has:extensions/v1beta1
E0114 03:50:28.094508   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx" deleted
generic-resources.sh:281: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
(BI0114 03:50:28.397196   54689 namespace_controller.go:185] Namespace has been deleted non-native-resources
generic-resources.sh:285: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
(BSuccessful
message:kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:290: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
(BSuccessful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:busybox0:busybox1:
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:299: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
(BE0114 03:50:28.739057   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
E0114 03:50:28.852042   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:304: Successful get pods {{range.items}}{{.metadata.labels.mylabel}}:{{end}}: myvalue:myvalue:
(BSuccessful
message:pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E0114 03:50:28.971812   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:309: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
(BE0114 03:50:29.095614   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
generic-resources.sh:314: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: prom/busybox:prom/busybox:
(BSuccessful
message:pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:319: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
(Bgeneric-resources.sh:323: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
(BSuccessful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "busybox0" force deleted
pod "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:328: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
(BE0114 03:50:29.740268   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/busybox0 created
I0114 03:50:29.851048   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973824-8778", Name:"busybox0", UID:"7158a0ff-1b3d-486e-8c91-341e0afe0958", APIVersion:"v1", ResourceVersion:"1053", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-lzcgx
replicationcontroller/busybox1 created
error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
E0114 03:50:29.853189   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 03:50:29.855851   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973824-8778", Name:"busybox1", UID:"cb0f5686-e937-4e14-8f09-1cf64a08f90d", APIVersion:"v1", ResourceVersion:"1055", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-8dj8w
E0114 03:50:29.973111   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:332: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
(Bgeneric-resources.sh:337: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
(BE0114 03:50:30.097064   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:338: Successful get rc busybox0 {{.spec.replicas}}: 1
(Bgeneric-resources.sh:339: Successful get rc busybox1 {{.spec.replicas}}: 1
(Bgeneric-resources.sh:344: Successful get hpa busybox0 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
(Bgeneric-resources.sh:345: Successful get hpa busybox1 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
(BSuccessful
message:horizontalpodautoscaler.autoscaling/busybox0 autoscaled
horizontalpodautoscaler.autoscaling/busybox1 autoscaled
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
horizontalpodautoscaler.autoscaling "busybox0" deleted
E0114 03:50:30.741568   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling "busybox1" deleted
E0114 03:50:30.854260   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:353: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
(BE0114 03:50:30.974270   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:354: Successful get rc busybox0 {{.spec.replicas}}: 1
(BE0114 03:50:31.098111   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:355: Successful get rc busybox1 {{.spec.replicas}}: 1
(Bgeneric-resources.sh:359: Successful get service busybox0 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
(Bgeneric-resources.sh:360: Successful get service busybox1 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
(BSuccessful
message:service/busybox0 exposed
service/busybox1 exposed
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:366: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
(Bgeneric-resources.sh:367: Successful get rc busybox0 {{.spec.replicas}}: 1
(BE0114 03:50:31.742953   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:368: Successful get rc busybox1 {{.spec.replicas}}: 1
(BE0114 03:50:31.855581   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 03:50:31.880925   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973824-8778", Name:"busybox0", UID:"7158a0ff-1b3d-486e-8c91-341e0afe0958", APIVersion:"v1", ResourceVersion:"1075", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-4xh8l
I0114 03:50:31.892029   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973824-8778", Name:"busybox1", UID:"cb0f5686-e937-4e14-8f09-1cf64a08f90d", APIVersion:"v1", ResourceVersion:"1079", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-r789m
E0114 03:50:31.975530   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:372: Successful get rc busybox0 {{.spec.replicas}}: 2
(BE0114 03:50:32.099311   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:373: Successful get rc busybox1 {{.spec.replicas}}: 2
(BSuccessful
message:replicationcontroller/busybox0 scaled
replicationcontroller/busybox1 scaled
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
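Scale, autoscale, and expose are exercised against the same recursive directory of replication controllers, and the pattern repeats: both valid RCs are mutated and the broken manifest contributes only the decode error. A sketch of the three calls with the values the assertions check (min 1 / max 2 / 80% CPU, port 80, then 2 replicas):

  kubectl autoscale -f hack/testdata/recursive/rc --recursive --min=1 --max=2 --cpu-percent=80
  kubectl expose -f hack/testdata/recursive/rc --recursive --port=80
  kubectl scale -f hack/testdata/recursive/rc --recursive --replicas=2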
generic-resources.sh:378: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
(Bgeneric-resources.sh:382: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
(BSuccessful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
replicationcontroller "busybox0" force deleted
replicationcontroller "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:387: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
(BE0114 03:50:32.744268   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx1-deployment created
I0114 03:50:32.806399   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973824-8778", Name:"nginx1-deployment", UID:"a832d4dd-617b-4d57-ab08-2442be0b38d4", APIVersion:"apps/v1", ResourceVersion:"1097", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx1-deployment-7bdbbfb5cf to 2
deployment.apps/nginx0-deployment created
error: error validating "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0114 03:50:32.812977   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973824-8778", Name:"nginx0-deployment", UID:"530bc3b2-ab58-49ca-a52c-266e1b803170", APIVersion:"apps/v1", ResourceVersion:"1099", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx0-deployment-57c6bff7f6 to 2
I0114 03:50:32.812977   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973824-8778", Name:"nginx1-deployment-7bdbbfb5cf", UID:"c5b55f7d-d4b0-4592-b07b-b4f5baacdb4c", APIVersion:"apps/v1", ResourceVersion:"1098", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-7bdbbfb5cf-mzttq
I0114 03:50:32.816078   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973824-8778", Name:"nginx1-deployment-7bdbbfb5cf", UID:"c5b55f7d-d4b0-4592-b07b-b4f5baacdb4c", APIVersion:"apps/v1", ResourceVersion:"1098", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-7bdbbfb5cf-cw7bj
I0114 03:50:32.819449   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973824-8778", Name:"nginx0-deployment-57c6bff7f6", UID:"54178ed5-ce25-42d4-b29c-0f877f2f1140", APIVersion:"apps/v1", ResourceVersion:"1103", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-57c6bff7f6-rn7bd
I0114 03:50:32.824333   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973824-8778", Name:"nginx0-deployment-57c6bff7f6", UID:"54178ed5-ce25-42d4-b29c-0f877f2f1140", APIVersion:"apps/v1", ResourceVersion:"1103", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-57c6bff7f6-gkqt9
E0114 03:50:32.856606   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:391: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx0-deployment:nginx1-deployment:
(BE0114 03:50:32.976803   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:392: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
(BE0114 03:50:33.100606   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:396: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
(BSuccessful
message:deployment.apps/nginx1-deployment skipped rollback (current template already matches revision 1)
deployment.apps/nginx0-deployment skipped rollback (current template already matches revision 1)
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
deployment.apps/nginx1-deployment paused
deployment.apps/nginx0-deployment paused
generic-resources.sh:404: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: true:true:
(BSuccessful
message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
deployment.apps/nginx1-deployment resumed
deployment.apps/nginx0-deployment resumed
E0114 03:50:33.745476   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:410: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: <no value>:<no value>:
(BSuccessful
message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
E0114 03:50:33.857762   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx0-deployment
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx1-deployment
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
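The rollout subcommands get the same recursive treatment: pause and resume flip .spec.paused on both valid deployments, undo reports "skipped rollback" because the template already matches revision 1, history prints a revision table per deployment, and the broken manifest again adds only the decode error. A sketch:

  kubectl rollout pause -f hack/testdata/recursive/deployment --recursive
  kubectl rollout resume -f hack/testdata/recursive/deployment --recursive
  kubectl rollout history -f hack/testdata/recursive/deployment --recursive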
E0114 03:50:33.978228   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
deployment.apps "nginx1-deployment" force deleted
deployment.apps "nginx0-deployment" force deleted
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
E0114 03:50:34.101725   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:34.746858   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:34.859008   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:34.979513   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:35.102969   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:426: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/busybox0 created
I0114 03:50:35.407396   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973824-8778", Name:"busybox0", UID:"9654bed0-0435-4f62-be6d-045eced14a82", APIVersion:"v1", ResourceVersion:"1147", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-6r76x
replicationcontroller/busybox1 created
error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0114 03:50:35.412336   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973824-8778", Name:"busybox1", UID:"d9fd8bd8-07ca-48c3-81e8-29f5f32afc06", APIVersion:"v1", ResourceVersion:"1149", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-kfljn
generic-resources.sh:430: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:no rollbacker has been implemented for "ReplicationController"
Successful
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
E0114 03:50:35.749114   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox0" pausing is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox1" pausing is not supported
E0114 03:50:35.860442   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:replicationcontrollers "busybox0" resuming is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:replicationcontrollers "busybox1" resuming is not supported
E0114 03:50:35.980965   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
replicationcontroller "busybox0" force deleted
replicationcontroller "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
E0114 03:50:36.104304   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:36.750341   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:36.861634   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:36.982064   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_namespace_tests
Running command: run_namespace_tests

+++ Running case: test-cmd.run_namespace_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_namespace_tests
+++ [0114 03:50:37] Testing kubectl(v1:namespaces)
E0114 03:50:37.105407   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/my-namespace created
core.sh:1314: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
namespace "my-namespace" deleted
E0114 03:50:37.751880   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:37.862721   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:37.983196   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:38.106401   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:38.753088   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:38.863910   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:38.984346   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:39.107619   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:39.754399   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:39.865191   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:39.985511   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:40.108897   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 03:50:40.686116   54689 shared_informer.go:206] Waiting for caches to sync for resource quota
I0114 03:50:40.686161   54689 shared_informer.go:213] Caches are synced for resource quota 
E0114 03:50:40.755564   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:40.866379   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:40.986584   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:41.109953   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 03:50:41.197882   54689 shared_informer.go:206] Waiting for caches to sync for garbage collector
I0114 03:50:41.197949   54689 shared_informer.go:213] Caches are synced for garbage collector 
E0114 03:50:41.756971   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:41.867581   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:41.987855   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:42.110981   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/my-namespace condition met
Successful
message:Error from server (NotFound): namespaces "my-namespace" not found
has: not found
namespace/my-namespace created
E0114 03:50:42.758129   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1323: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
E0114 03:50:42.868731   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:42.988941   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
namespace "kube-node-lease" deleted
namespace "my-namespace" deleted
namespace "namespace-1578973689-8545" deleted
namespace "namespace-1578973692-18112" deleted
... skipping 26 lines ...
namespace "namespace-1578973796-8570" deleted
namespace "namespace-1578973797-15934" deleted
namespace "namespace-1578973800-12295" deleted
namespace "namespace-1578973801-21316" deleted
namespace "namespace-1578973824-3659" deleted
namespace "namespace-1578973824-8778" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:warning: deleting cluster-scoped resources
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
namespace "kube-node-lease" deleted
namespace "my-namespace" deleted
namespace "namespace-1578973689-8545" deleted
... skipping 27 lines ...
namespace "namespace-1578973796-8570" deleted
namespace "namespace-1578973797-15934" deleted
namespace "namespace-1578973800-12295" deleted
namespace "namespace-1578973801-21316" deleted
namespace "namespace-1578973824-3659" deleted
namespace "namespace-1578973824-8778" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:namespace "my-namespace" deleted
E0114 03:50:43.112364   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1335: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"other\" }}found{{end}}{{end}}:: :
namespace/other created
core.sh:1339: Successful get namespaces/other {{.metadata.name}}: other
core.sh:1343: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
pod/valid-pod created
E0114 03:50:43.759251   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1347: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:1349: Successful get pods -n other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
E0114 03:50:43.869855   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: a resource cannot be retrieved by name across all namespaces
has:a resource cannot be retrieved by name across all namespaces
E0114 03:50:43.990116   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1356: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
E0114 03:50:44.113567   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
core.sh:1360: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
namespace "other" deleted
E0114 03:50:44.760746   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:44.870984   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:44.991354   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:45.114772   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 03:50:45.380443   54689 horizontal.go:353] Horizontal Pod Autoscaler busybox0 has been deleted in namespace-1578973824-8778
I0114 03:50:45.384449   54689 horizontal.go:353] Horizontal Pod Autoscaler busybox1 has been deleted in namespace-1578973824-8778
E0114 03:50:45.761997   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:45.872294   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:45.992383   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:46.116176   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:46.763149   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:46.873595   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:46.993500   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:47.117228   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:47.764434   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:47.874572   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:47.994399   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:48.120752   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:48.765726   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:48.875592   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:48.995485   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:49.121853   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_secrets_test
Running command: run_secrets_test

+++ Running case: test-cmd.run_secrets_test 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_secrets_test
+++ [0114 03:50:49] Creating namespace namespace-1578973849-20575
namespace/namespace-1578973849-20575 created
E0114 03:50:49.766818   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0114 03:50:49] Testing secrets
I0114 03:50:49.840814   71608 loader.go:375] Config loaded from file:  /tmp/tmp.VHuTjdTtMr/.kube/config
Successful
message:apiVersion: v1
data:
... skipping 27 lines ...
  key1: dmFsdWUx
kind: Secret
metadata:
  creationTimestamp: null
  name: test
has not:example.com
E0114 03:50:49.876877   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:725: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-secrets\" }}found{{end}}{{end}}:: :
E0114 03:50:49.996715   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/test-secrets created
E0114 03:50:50.122876   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:729: Successful get namespaces/test-secrets {{.metadata.name}}: test-secrets
core.sh:733: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
secret/test-secret created
core.sh:737: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
core.sh:738: Successful get secret/test-secret --namespace=test-secrets {{.type}}: test-type
secret "test-secret" deleted
E0114 03:50:50.768058   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:748: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 03:50:50.878038   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret/test-secret created
E0114 03:50:50.997923   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:752: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
E0114 03:50:51.124027   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:753: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/dockerconfigjson
secret "test-secret" deleted
core.sh:763: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
secret/test-secret created
core.sh:766: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
core.sh:767: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/tls
E0114 03:50:51.769268   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret "test-secret" deleted
secret/test-secret created
E0114 03:50:51.879295   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:773: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
E0114 03:50:51.999016   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:774: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/tls
E0114 03:50:52.125360   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret "test-secret" deleted
secret/secret-string-data created
core.sh:796: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
core.sh:797: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
I0114 03:50:52.564642   54689 namespace_controller.go:185] Namespace has been deleted my-namespace
core.sh:798: Successful get secret/secret-string-data --namespace=test-secrets  {{.stringData}}: <no value>
secret "secret-string-data" deleted
E0114 03:50:52.770440   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:807: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 03:50:52.880658   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:53.000279   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret "test-secret" deleted
namespace "test-secrets" deleted
I0114 03:50:53.089605   54689 namespace_controller.go:185] Namespace has been deleted namespace-1578973689-8545
I0114 03:50:53.100054   54689 namespace_controller.go:185] Namespace has been deleted kube-node-lease
I0114 03:50:53.110620   54689 namespace_controller.go:185] Namespace has been deleted namespace-1578973692-18112
E0114 03:50:53.126323   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 03:50:53.128608   54689 namespace_controller.go:185] Namespace has been deleted namespace-1578973709-9052
I0114 03:50:53.128633   54689 namespace_controller.go:185] Namespace has been deleted namespace-1578973716-21910
I0114 03:50:53.133908   54689 namespace_controller.go:185] Namespace has been deleted namespace-1578973713-31593
I0114 03:50:53.144975   54689 namespace_controller.go:185] Namespace has been deleted namespace-1578973709-2140
I0114 03:50:53.145153   54689 namespace_controller.go:185] Namespace has been deleted namespace-1578973698-3423
I0114 03:50:53.170547   54689 namespace_controller.go:185] Namespace has been deleted namespace-1578973714-21371
... skipping 15 lines ...
I0114 03:50:53.648926   54689 namespace_controller.go:185] Namespace has been deleted namespace-1578973785-7500
I0114 03:50:53.653504   54689 namespace_controller.go:185] Namespace has been deleted namespace-1578973791-1582
I0114 03:50:53.670160   54689 namespace_controller.go:185] Namespace has been deleted namespace-1578973796-22314
I0114 03:50:53.679499   54689 namespace_controller.go:185] Namespace has been deleted namespace-1578973792-9156
I0114 03:50:53.689108   54689 namespace_controller.go:185] Namespace has been deleted namespace-1578973756-12736
I0114 03:50:53.710415   54689 namespace_controller.go:185] Namespace has been deleted namespace-1578973796-8570
E0114 03:50:53.771983   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 03:50:53.772587   54689 namespace_controller.go:185] Namespace has been deleted namespace-1578973797-15934
I0114 03:50:53.780798   54689 namespace_controller.go:185] Namespace has been deleted namespace-1578973800-12295
I0114 03:50:53.794118   54689 namespace_controller.go:185] Namespace has been deleted namespace-1578973801-21316
I0114 03:50:53.807615   54689 namespace_controller.go:185] Namespace has been deleted namespace-1578973824-3659
I0114 03:50:53.851577   54689 namespace_controller.go:185] Namespace has been deleted namespace-1578973824-8778
E0114 03:50:53.881970   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:54.001503   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:54.127361   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 03:50:54.486545   54689 namespace_controller.go:185] Namespace has been deleted other
E0114 03:50:54.773221   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:54.883281   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:55.002634   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:55.128570   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:55.774403   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:55.884584   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:56.004245   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:56.129675   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:56.775594   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:56.885915   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:57.005318   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:57.130951   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:57.776690   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:57.887081   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:58.006741   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:50:58.131926   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_configmap_tests
Running command: run_configmap_tests

+++ Running case: test-cmd.run_configmap_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_configmap_tests
+++ [0114 03:50:58] Creating namespace namespace-1578973858-23454
namespace/namespace-1578973858-23454 created
Context "test" modified.
+++ [0114 03:50:58] Testing configmaps
configmap/test-configmap created
E0114 03:50:58.777900   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:28: Successful get configmap/test-configmap {{.metadata.name}}: test-configmap
E0114 03:50:58.888579   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap "test-configmap" deleted
E0114 03:50:59.008108   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:33: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-configmaps\" }}found{{end}}{{end}}:: :
E0114 03:50:59.135318   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/test-configmaps created
core.sh:37: Successful get namespaces/test-configmaps {{.metadata.name}}: test-configmaps
core.sh:41: Successful get configmaps {{range.items}}{{ if eq .metadata.name \"test-configmap\" }}found{{end}}{{end}}:: :
core.sh:42: Successful get configmaps {{range.items}}{{ if eq .metadata.name \"test-binary-configmap\" }}found{{end}}{{end}}:: :
configmap/test-configmap created
configmap/test-binary-configmap created
E0114 03:50:59.778900   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:48: Successful get configmap/test-configmap --namespace=test-configmaps {{.metadata.name}}: test-configmap
E0114 03:50:59.889824   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:49: Successful get configmap/test-binary-configmap --namespace=test-configmaps {{.metadata.name}}: test-binary-configmap
E0114 03:51:00.009212   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:00.136782   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap "test-configmap" deleted
configmap "test-binary-configmap" deleted
namespace "test-configmaps" deleted
E0114 03:51:00.780444   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:00.891005   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:01.010530   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:01.137850   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:01.781689   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:01.892306   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:02.011851   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:02.138985   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:02.782858   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:02.893450   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:03.012997   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:03.140336   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 03:51:03.184865   54689 namespace_controller.go:185] Namespace has been deleted test-secrets
E0114 03:51:03.784337   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:03.894607   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:04.014148   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:04.141386   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:04.785676   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:04.896080   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:05.015372   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:05.142517   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_client_config_tests
Running command: run_client_config_tests

+++ Running case: test-cmd.run_client_config_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_client_config_tests
+++ [0114 03:51:05] Creating namespace namespace-1578973865-21044
namespace/namespace-1578973865-21044 created
E0114 03:51:05.786801   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0114 03:51:05] Testing client config
E0114 03:51:05.897311   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
E0114 03:51:06.016748   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
E0114 03:51:06.143584   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Error in configuration: context was not found for specified context: missing-context
has:context was not found for specified context: missing-context
Successful
message:error: no server found for cluster "missing-cluster"
has:no server found for cluster "missing-cluster"
Successful
message:error: auth info "missing-user" does not exist
has:auth info "missing-user" does not exist
Successful
message:error: error loading config file "/tmp/newconfig.yaml": no kind "Config" is registered for version "v-1" in scheme "k8s.io/client-go/tools/clientcmd/api/latest/latest.go:50"
has:error loading config file
Successful
message:error: stat missing-config: no such file or directory
has:no such file or directory
+++ exit code: 0
Recording: run_service_accounts_tests
Running command: run_service_accounts_tests

+++ Running case: test-cmd.run_service_accounts_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_service_accounts_tests
+++ [0114 03:51:06] Creating namespace namespace-1578973866-22470
namespace/namespace-1578973866-22470 created
E0114 03:51:06.788014   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0114 03:51:06] Testing service accounts
E0114 03:51:06.898454   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:828: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-service-accounts\" }}found{{end}}{{end}}:: :
namespace/test-service-accounts created
E0114 03:51:07.017679   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:832: Successful get namespaces/test-service-accounts {{.metadata.name}}: test-service-accounts
E0114 03:51:07.144894   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
serviceaccount/test-service-account created
core.sh:838: Successful get serviceaccount/test-service-account --namespace=test-service-accounts {{.metadata.name}}: test-service-account
serviceaccount "test-service-account" deleted
namespace "test-service-accounts" deleted
E0114 03:51:07.789360   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:07.900031   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:08.018903   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:08.145732   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:08.790502   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:08.901366   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:09.020578   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:09.146869   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:09.792085   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:09.902555   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:10.021789   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:10.148291   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 03:51:10.488553   54689 namespace_controller.go:185] Namespace has been deleted test-configmaps
E0114 03:51:10.793187   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:10.903934   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:11.022885   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:11.149355   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:11.794343   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:11.905121   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:12.024454   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:12.150518   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_job_tests
Running command: run_job_tests

+++ Running case: test-cmd.run_job_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_job_tests
+++ [0114 03:51:12] Creating namespace namespace-1578973872-4534
E0114 03:51:12.795577   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578973872-4534 created
Context "test" modified.
+++ [0114 03:51:12] Testing job
E0114 03:51:12.906349   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
batch.sh:30: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-jobs\" }}found{{end}}{{end}}:: :
E0114 03:51:13.025592   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/test-jobs created
E0114 03:51:13.152056   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
batch.sh:34: Successful get namespaces/test-jobs {{.metadata.name}}: test-jobs
kubectl run --generator=cronjob/v1beta1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
cronjob.batch/pi created
batch.sh:39: Successful get cronjob/pi --namespace=test-jobs {{.metadata.name}}: pi
NAME   SCHEDULE       SUSPEND   ACTIVE   LAST SCHEDULE   AGE
pi     59 23 31 2 *   False     0        <none>          0s
... skipping 2 lines ...
Labels:                        run=pi
Annotations:                   <none>
Schedule:                      59 23 31 2 *
Concurrency Policy:            Allow
Suspend:                       False
Successful Job History Limit:  3
Failed Job History Limit:      1
Starting Deadline Seconds:     <unset>
Selector:                      <unset>
Parallelism:                   <unset>
Completions:                   <unset>
Pod Template:
  Labels:  run=pi
... skipping 17 lines ...
Active Jobs:         <none>
Events:              <none>
Successful
message:job.batch/test-job
has:job.batch/test-job
batch.sh:48: Successful get jobs {{range.items}}{{.metadata.name}}{{end}}: 
E0114 03:51:13.796857   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 03:51:13.890787   54689 event.go:278] Event(v1.ObjectReference{Kind:"Job", Namespace:"test-jobs", Name:"test-job", UID:"5f6eed06-4e91-408e-b088-0eb497d623ca", APIVersion:"batch/v1", ResourceVersion:"1489", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-mfjqd
job.batch/test-job created
E0114 03:51:13.907199   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
batch.sh:53: Successful get job/test-job --namespace=test-jobs {{.metadata.name}}: test-job
E0114 03:51:14.027043   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAME       COMPLETIONS   DURATION   AGE
test-job   0/1           1s         1s
E0114 03:51:14.152911   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Name:           test-job
Namespace:      test-jobs
Selector:       controller-uid=5f6eed06-4e91-408e-b088-0eb497d623ca
Labels:         controller-uid=5f6eed06-4e91-408e-b088-0eb497d623ca
                job-name=test-job
                run=pi
Annotations:    cronjob.kubernetes.io/instantiate: manual
Controlled By:  CronJob/pi
Parallelism:    1
Completions:    1
Start Time:     Tue, 14 Jan 2020 03:51:13 +0000
Pods Statuses:  1 Running / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  controller-uid=5f6eed06-4e91-408e-b088-0eb497d623ca
           job-name=test-job
           run=pi
  Containers:
   pi:
... skipping 15 lines ...
  Type    Reason            Age   From            Message
  ----    ------            ----  ----            -------
  Normal  SuccessfulCreate  1s    job-controller  Created pod: test-job-mfjqd
job.batch "test-job" deleted
cronjob.batch "pi" deleted
namespace "test-jobs" deleted
E0114 03:51:14.797989   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:14.908650   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:15.028491   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:15.154044   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:15.799305   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:15.909987   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:16.029621   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:16.155012   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:16.800649   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:16.911279   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:17.030889   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:17.156387   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 03:51:17.591053   54689 namespace_controller.go:185] Namespace has been deleted test-service-accounts
E0114 03:51:17.801810   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:17.912873   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:18.032361   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:18.157342   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:18.803084   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:18.913974   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:19.033575   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:19.158561   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_create_job_tests
Running command: run_create_job_tests

+++ Running case: test-cmd.run_create_job_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_create_job_tests
+++ [0114 03:51:19] Creating namespace namespace-1578973879-22943
E0114 03:51:19.804447   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578973879-22943 created
E0114 03:51:19.915181   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
I0114 03:51:20.025998   54689 event.go:278] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1578973879-22943", Name:"test-job", UID:"d2d03f5f-9f27-492e-bf51-0153a2abb4d6", APIVersion:"batch/v1", ResourceVersion:"1513", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-c6vcw
job.batch/test-job created
E0114 03:51:20.036483   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
create.sh:86: Successful get job test-job {{(index .spec.template.spec.containers 0).image}}: k8s.gcr.io/nginx:test-cmd
E0114 03:51:20.159866   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
job.batch "test-job" deleted
I0114 03:51:20.341349   54689 event.go:278] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1578973879-22943", Name:"test-job-pi", UID:"736f9a47-0a5f-4717-afec-c7dcbc26424c", APIVersion:"batch/v1", ResourceVersion:"1522", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-pi-zrb6z
job.batch/test-job-pi created
create.sh:92: Successful get job test-job-pi {{(index .spec.template.spec.containers 0).image}}: k8s.gcr.io/perl
job.batch "test-job-pi" deleted
kubectl run --generator=cronjob/v1beta1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
cronjob.batch/test-pi created
I0114 03:51:20.753664   54689 event.go:278] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1578973879-22943", Name:"my-pi", UID:"a28da077-86a8-4d08-b2e9-70bbad730f64", APIVersion:"batch/v1", ResourceVersion:"1530", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: my-pi-q8vqj
job.batch/my-pi created
E0114 03:51:20.805620   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:[perl -Mbignum=bpi -wle print bpi(10)]
has:perl -Mbignum=bpi -wle print bpi(10)
E0114 03:51:20.916628   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
job.batch "my-pi" deleted
E0114 03:51:21.037588   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
cronjob.batch "test-pi" deleted
+++ exit code: 0
Recording: run_pod_templates_tests
Running command: run_pod_templates_tests

+++ Running case: test-cmd.run_pod_templates_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_pod_templates_tests
E0114 03:51:21.161446   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [0114 03:51:21] Creating namespace namespace-1578973881-3891
namespace/namespace-1578973881-3891 created
Context "test" modified.
+++ [0114 03:51:21] Testing pod templates
core.sh:1421: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: 
I0114 03:51:21.620406   51249 controller.go:606] quota admission added evaluator for: podtemplates
podtemplate/nginx created
core.sh:1425: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:
E0114 03:51:21.806757   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAME    CONTAINERS   IMAGES   POD LABELS
nginx   nginx        nginx    name=nginx
E0114 03:51:21.925831   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:22.038803   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1433: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:
podtemplate "nginx" deleted
E0114 03:51:22.162540   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1437: Successful get podtemplate {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
Recording: run_service_tests
Running command: run_service_tests

+++ Running case: test-cmd.run_service_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_service_tests
Context "test" modified.
+++ [0114 03:51:22] Testing kubectl(v1:services)
core.sh:858: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
service/redis-master created
E0114 03:51:22.807955   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:862: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
E0114 03:51:22.926949   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Labels:
matched Selector:
matched IP:
matched Port:
matched Endpoints:
... skipping 10 lines ...
IP:                10.0.0.98
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
E0114 03:51:23.040206   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:866: Successful describe
Name:              redis-master
Namespace:         default
Labels:            app=redis
                   role=master
                   tier=backend
... skipping 4 lines ...
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
E0114 03:51:23.164175   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:868: Successful describe
Name:              redis-master
Namespace:         default
Labels:            app=redis
                   role=master
                   tier=backend
... skipping 144 lines ...
IP:                10.0.0.98
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
E0114 03:51:23.809168   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:882: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
E0114 03:51:23.928385   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apiVersion: v1
kind: Service
metadata:
  creationTimestamp: null
  labels:
    app: redis
... skipping 5 lines ...
  - port: 6379
    targetPort: 6379
  selector:
    role: padawan
status:
  loadBalancer: {}
E0114 03:51:24.041347   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apiVersion: v1
kind: Service
metadata:
  creationTimestamp: "2020-01-14T03:51:22Z"
  labels:
    app: redis
... skipping 14 lines ...
    role: padawan
  sessionAffinity: None
  type: ClusterIP
status:
  loadBalancer: {}
service/redis-master selector updated
E0114 03:51:24.165297   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:890: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: padawan:
service/redis-master selector updated
core.sh:894: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
apiVersion: v1
kind: Service
metadata:
... skipping 17 lines ...
    role: padawan
  sessionAffinity: None
  type: ClusterIP
status:
  loadBalancer: {}
I0114 03:51:24.644555   54689 namespace_controller.go:185] Namespace has been deleted test-jobs
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
core.sh:898: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
E0114 03:51:24.810444   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:24.929715   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master selector updated
E0114 03:51:25.042485   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Error from server (Conflict): Operation cannot be fulfilled on services "redis-master": the object has been modified; please apply your changes to the latest version and try again
has:Conflict
E0114 03:51:25.166367   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:911: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
service "redis-master" deleted
core.sh:918: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
core.sh:922: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E0114 03:51:25.811597   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master created
E0114 03:51:25.930966   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:926: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
E0114 03:51:26.043931   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:930: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
E0114 03:51:26.167832   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/service-v1-test created
core.sh:951: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:service-v1-test:
service/service-v1-test replaced
E0114 03:51:26.812735   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:958: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:service-v1-test:
service "redis-master" deleted
E0114 03:51:26.932284   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "service-v1-test" deleted
E0114 03:51:27.045012   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:966: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E0114 03:51:27.169184   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:970: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
service/redis-master created
service/redis-slave created
core.sh:975: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:redis-slave:
E0114 03:51:27.814174   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:NAME           RSRC
kubernetes     144
redis-master   1568
redis-slave    1571
has:redis-master
E0114 03:51:27.933612   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:985: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:redis-slave:
E0114 03:51:28.048079   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "redis-master" deleted
service "redis-slave" deleted
E0114 03:51:28.170468   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:992: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
core.sh:996: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
service/beep-boop created
core.sh:1000: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: beep-boop:kubernetes:
core.sh:1004: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: beep-boop:kubernetes:
service "beep-boop" deleted
E0114 03:51:28.815483   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1011: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E0114 03:51:28.935049   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1015: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
kubectl run --generator=deployment/apps.v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
I0114 03:51:29.041113   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"default", Name:"testmetadata", UID:"51cfc298-753a-4d5b-b5c9-37acc2ce2eb6", APIVersion:"apps/v1", ResourceVersion:"1587", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set testmetadata-bd968f46 to 2
I0114 03:51:29.047903   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"default", Name:"testmetadata-bd968f46", UID:"327239e6-ad94-415b-8416-99e6d556eed6", APIVersion:"apps/v1", ResourceVersion:"1588", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: testmetadata-bd968f46-qs9lg
E0114 03:51:29.049179   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 03:51:29.050470   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"default", Name:"testmetadata-bd968f46", UID:"327239e6-ad94-415b-8416-99e6d556eed6", APIVersion:"apps/v1", ResourceVersion:"1588", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: testmetadata-bd968f46-g2786
service/testmetadata created
deployment.apps/testmetadata created
E0114 03:51:29.171512   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1019: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: testmetadata:
core.sh:1020: Successful get service testmetadata {{.metadata.annotations}}: map[zone-context:home]
service/exposemetadata exposed
core.sh:1026: Successful get service exposemetadata {{.metadata.annotations}}: map[zone-context:work]
service "exposemetadata" deleted
service "testmetadata" deleted
deployment.apps "testmetadata" deleted
+++ exit code: 0
Recording: run_daemonset_tests
Running command: run_daemonset_tests
E0114 03:51:29.817006   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_daemonset_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_daemonset_tests
+++ [0114 03:51:29] Creating namespace namespace-1578973889-27435
E0114 03:51:29.936220   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578973889-27435 created
Context "test" modified.
+++ [0114 03:51:30] Testing kubectl(v1:daemonsets)
E0114 03:51:30.050504   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:30: Successful get daemonsets {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 03:51:30.172787   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 03:51:30.337746   51249 controller.go:606] quota admission added evaluator for: daemonsets.apps
daemonset.apps/bind created
I0114 03:51:30.347920   51249 controller.go:606] quota admission added evaluator for: controllerrevisions.apps
apps.sh:34: Successful get daemonsets bind {{.metadata.generation}}: 1
daemonset.apps/bind configured
apps.sh:37: Successful get daemonsets bind {{.metadata.generation}}: 1
E0114 03:51:30.818186   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind image updated
E0114 03:51:30.937507   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:40: Successful get daemonsets bind {{.metadata.generation}}: 2
E0114 03:51:31.051986   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind env updated
E0114 03:51:31.174020   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:42: Successful get daemonsets bind {{.metadata.generation}}: 3
daemonset.apps/bind resource requirements updated
apps.sh:44: Successful get daemonsets bind {{.metadata.generation}}: 4
daemonset.apps/bind restarted
apps.sh:48: Successful get daemonsets bind {{.metadata.generation}}: 5
daemonset.apps "bind" deleted
+++ exit code: 0
E0114 03:51:31.819419   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_daemonset_history_tests
Running command: run_daemonset_history_tests

+++ Running case: test-cmd.run_daemonset_history_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_daemonset_history_tests
+++ [0114 03:51:31] Creating namespace namespace-1578973891-30252
E0114 03:51:31.938626   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578973891-30252 created
Context "test" modified.
+++ [0114 03:51:32] Testing kubectl(v1:daemonsets, v1:controllerrevisions)
E0114 03:51:32.053150   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:66: Successful get daemonsets {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 03:51:32.175206   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind created
apps.sh:70: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[deprecated.daemonset.template.generation:1 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1578973891-30252"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:2.0","name":"kubernetes-pause"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
daemonset.apps/bind skipped rollback (current template already matches revision 1)
apps.sh:73: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:74: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
E0114 03:51:32.820973   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:32.940221   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind configured
E0114 03:51:33.054275   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:77: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
E0114 03:51:33.176788   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:78: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:79: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
(Bapps.sh:80: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[deprecated.daemonset.template.generation:1 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1578973891-30252"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:2.0","name":"kubernetes-pause"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:map[deprecated.daemonset.template.generation:2 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1578973891-30252"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:latest","name":"kubernetes-pause"},{"image":"k8s.gcr.io/nginx:test-cmd","name":"app"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
daemonset.apps/bind will roll back to Pod Template:
... skipping 5 lines ...
    Host Port:	<none>
    Environment:	<none>
    Mounts:	<none>
  Volumes:	<none>
 (dry run)
apps.sh:83: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
E0114 03:51:33.822140   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:84: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0114 03:51:33.941383   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:85: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
E0114 03:51:34.055527   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind rolled back
E0114 03:51:34.178063   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:88: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:89: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
Successful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
apps.sh:93: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
E0114 03:51:34.823395   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:94: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
E0114 03:51:34.942640   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind rolled back
E0114 03:51:35.038524   54689 daemon_controller.go:291] namespace-1578973891-30252/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1578973891-30252", SelfLink:"/apis/apps/v1/namespaces/namespace-1578973891-30252/daemonsets/bind", UID:"20dd70af-bae0-4464-810c-c88a2dce01b9", ResourceVersion:"1660", Generation:4, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714570692, loc:(*time.Location)(0x6b26ba0)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"deprecated.daemonset.template.generation":"4", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true\"},\"labels\":{\"service\":\"bind\"},\"name\":\"bind\",\"namespace\":\"namespace-1578973891-30252\"},\"spec\":{\"selector\":{\"matchLabels\":{\"service\":\"bind\"}},\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n", "kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"kube-controller-manager", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc0016f5320), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc0016f5360)}, v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc0016f53a0), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc0016f53e0)}}}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc0016f5420), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:latest", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), 
VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}, v1.Container{Name:"app", Image:"k8s.gcr.io/nginx:test-cmd", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc002363998), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc002b68300), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc0016f5440), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc000edca00)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc0023639ec)}, Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:3, UpdatedNumberScheduled:0, NumberAvailable:0, NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
E0114 03:51:35.056943   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:97: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
E0114 03:51:35.179349   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:98: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:99: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
daemonset.apps "bind" deleted
+++ exit code: 0
Recording: run_rc_tests
Running command: run_rc_tests

+++ Running case: test-cmd.run_rc_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_rc_tests
+++ [0114 03:51:35] Creating namespace namespace-1578973895-5741
E0114 03:51:35.824798   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578973895-5741 created
E0114 03:51:35.943974   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0114 03:51:35] Testing kubectl(v1:replicationcontrollers)
E0114 03:51:36.057987   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1052: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 03:51:36.180882   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend created
I0114 03:51:36.323228   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"frontend", UID:"512542f4-f0db-4f59-96b3-c4d157f5c826", APIVersion:"v1", ResourceVersion:"1670", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-lf8df
I0114 03:51:36.339204   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"frontend", UID:"512542f4-f0db-4f59-96b3-c4d157f5c826", APIVersion:"v1", ResourceVersion:"1670", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-7644p
I0114 03:51:36.339318   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"frontend", UID:"512542f4-f0db-4f59-96b3-c4d157f5c826", APIVersion:"v1", ResourceVersion:"1670", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-cgsss
replicationcontroller "frontend" deleted
core.sh:1057: Successful get pods -l "name=frontend" {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:1061: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 03:51:36.825990   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend created
I0114 03:51:36.856005   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"frontend", UID:"4466f0e3-72b4-4110-9f13-a26bd74feb4d", APIVersion:"v1", ResourceVersion:"1686", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-lxhrt
I0114 03:51:36.858377   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"frontend", UID:"4466f0e3-72b4-4110-9f13-a26bd74feb4d", APIVersion:"v1", ResourceVersion:"1686", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-f6b9t
I0114 03:51:36.859970   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"frontend", UID:"4466f0e3-72b4-4110-9f13-a26bd74feb4d", APIVersion:"v1", ResourceVersion:"1686", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-86wpk
E0114 03:51:36.945186   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1065: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
E0114 03:51:37.059378   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Pod Template:
matched Labels:
matched Selector:
matched Replicas:
matched Pods Status:
... skipping 4 lines ...
Namespace:    namespace-1578973895-5741
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-lxhrt
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-f6b9t
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-86wpk
E0114 03:51:37.182233   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1069: Successful describe
Name:         frontend
Namespace:    namespace-1578973895-5741
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
Namespace:    namespace-1578973895-5741
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 12 lines ...
Namespace:    namespace-1578973895-5741
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 27 lines ...
Namespace:    namespace-1578973895-5741
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
Namespace:    namespace-1578973895-5741
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-lxhrt
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-f6b9t
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-86wpk
E0114 03:51:37.827309   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1578973895-5741
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 3 lines ...
      cpu:     100m
      memory:  100Mi
    Environment:
      GET_HOSTS_FROM:  dns
    Mounts:            <none>
  Volumes:             <none>
E0114 03:51:37.946365   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1578973895-5741
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  2s    replication-controller  Created pod: frontend-lxhrt
  Normal  SuccessfulCreate  2s    replication-controller  Created pod: frontend-f6b9t
  Normal  SuccessfulCreate  2s    replication-controller  Created pod: frontend-86wpk
E0114 03:51:38.060792   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1085: Successful get rc frontend {{.spec.replicas}}: 3
E0114 03:51:38.183480   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend scaled
E0114 03:51:38.241852   54689 replica_set.go:199] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1578973895-5741 /api/v1/namespaces/namespace-1578973895-5741/replicationcontrollers/frontend 4466f0e3-72b4-4110-9f13-a26bd74feb4d 1695 2 2020-01-14 03:51:36 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v4 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc002b3aa98 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] []}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:1,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
I0114 03:51:38.249262   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"frontend", UID:"4466f0e3-72b4-4110-9f13-a26bd74feb4d", APIVersion:"v1", ResourceVersion:"1695", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-lxhrt
core.sh:1089: Successful get rc frontend {{.spec.replicas}}: 2
core.sh:1093: Successful get rc frontend {{.spec.replicas}}: 2
error: Expected replicas to be 3, was 2
core.sh:1097: Successful get rc frontend {{.spec.replicas}}: 2
core.sh:1101: Successful get rc frontend {{.spec.replicas}}: 2
E0114 03:51:38.828707   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend scaled
I0114 03:51:38.891292   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"frontend", UID:"4466f0e3-72b4-4110-9f13-a26bd74feb4d", APIVersion:"v1", ResourceVersion:"1703", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-bjcxl
E0114 03:51:38.947463   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1105: Successful get rc frontend {{.spec.replicas}}: 3
E0114 03:51:39.061934   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1109: Successful get rc frontend {{.spec.replicas}}: 3
E0114 03:51:39.184693   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:39.205495   54689 replica_set.go:199] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1578973895-5741 /api/v1/namespaces/namespace-1578973895-5741/replicationcontrollers/frontend 4466f0e3-72b4-4110-9f13-a26bd74feb4d 1708 4 2020-01-14 03:51:36 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v4 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc00293a9e8 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] []}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:3,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
replicationcontroller/frontend scaled
I0114 03:51:39.210804   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"frontend", UID:"4466f0e3-72b4-4110-9f13-a26bd74feb4d", APIVersion:"v1", ResourceVersion:"1708", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-bjcxl
core.sh:1113: Successful get rc frontend {{.spec.replicas}}: 2
replicationcontroller "frontend" deleted
replicationcontroller/redis-master created
I0114 03:51:39.623122   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"redis-master", UID:"f7e21e7e-9702-4629-a798-1ca202a3951f", APIVersion:"v1", ResourceVersion:"1716", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-tskhg
E0114 03:51:39.829900   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/redis-slave created
I0114 03:51:39.839446   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"redis-slave", UID:"74f41a85-fae5-4b93-bd08-4184fef09ed8", APIVersion:"v1", ResourceVersion:"1725", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-s76rq
I0114 03:51:39.853941   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"redis-slave", UID:"74f41a85-fae5-4b93-bd08-4184fef09ed8", APIVersion:"v1", ResourceVersion:"1725", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-jm8wz
E0114 03:51:39.948559   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/redis-master scaled
I0114 03:51:39.952325   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"redis-master", UID:"f7e21e7e-9702-4629-a798-1ca202a3951f", APIVersion:"v1", ResourceVersion:"1732", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-ptt5q
replicationcontroller/redis-slave scaled
I0114 03:51:39.958483   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"redis-master", UID:"f7e21e7e-9702-4629-a798-1ca202a3951f", APIVersion:"v1", ResourceVersion:"1732", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-nzpc6
I0114 03:51:39.958888   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"redis-master", UID:"f7e21e7e-9702-4629-a798-1ca202a3951f", APIVersion:"v1", ResourceVersion:"1732", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-shdsx
I0114 03:51:39.967222   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"redis-slave", UID:"74f41a85-fae5-4b93-bd08-4184fef09ed8", APIVersion:"v1", ResourceVersion:"1734", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-j7k9v
I0114 03:51:39.971556   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"redis-slave", UID:"74f41a85-fae5-4b93-bd08-4184fef09ed8", APIVersion:"v1", ResourceVersion:"1734", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-t24f8
E0114 03:51:40.063028   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1123: Successful get rc redis-master {{.spec.replicas}}: 4
E0114 03:51:40.185837   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1124: Successful get rc redis-slave {{.spec.replicas}}: 4
(Breplicationcontroller "redis-master" deleted
replicationcontroller "redis-slave" deleted
deployment.apps/nginx-deployment created
I0114 03:51:40.522317   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment", UID:"dd13ebcc-a73c-43be-80dc-cf7aee277493", APIVersion:"apps/v1", ResourceVersion:"1768", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I0114 03:51:40.529030   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment-6986c7bc94", UID:"bd7c9582-4d43-4626-8171-07cc03609939", APIVersion:"apps/v1", ResourceVersion:"1769", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-r78dg
I0114 03:51:40.531582   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment-6986c7bc94", UID:"bd7c9582-4d43-4626-8171-07cc03609939", APIVersion:"apps/v1", ResourceVersion:"1769", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-t9skz
I0114 03:51:40.535343   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment-6986c7bc94", UID:"bd7c9582-4d43-4626-8171-07cc03609939", APIVersion:"apps/v1", ResourceVersion:"1769", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-8gc4l
deployment.apps/nginx-deployment scaled
I0114 03:51:40.629247   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment", UID:"dd13ebcc-a73c-43be-80dc-cf7aee277493", APIVersion:"apps/v1", ResourceVersion:"1782", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-6986c7bc94 to 1
I0114 03:51:40.637537   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment-6986c7bc94", UID:"bd7c9582-4d43-4626-8171-07cc03609939", APIVersion:"apps/v1", ResourceVersion:"1783", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-6986c7bc94-t9skz
I0114 03:51:40.639971   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment-6986c7bc94", UID:"bd7c9582-4d43-4626-8171-07cc03609939", APIVersion:"apps/v1", ResourceVersion:"1783", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-6986c7bc94-r78dg
core.sh:1133: Successful get deployment nginx-deployment {{.spec.replicas}}: 1
E0114 03:51:40.831118   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-deployment" deleted
E0114 03:51:40.949553   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:service/expose-test-deployment exposed
has:service/expose-test-deployment exposed
E0114 03:51:41.064363   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "expose-test-deployment" deleted
Successful
message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
See 'kubectl expose -h' for help and examples
has:invalid deployment: no selectors
E0114 03:51:41.187184   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I0114 03:51:41.377170   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment", UID:"7b530df0-6ba0-4a8d-9c31-66242784d29e", APIVersion:"apps/v1", ResourceVersion:"1806", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I0114 03:51:41.382188   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment-6986c7bc94", UID:"b2ad9962-485e-4fb9-ab56-d62e8d94e16e", APIVersion:"apps/v1", ResourceVersion:"1807", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-7nfj7
I0114 03:51:41.386548   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment-6986c7bc94", UID:"b2ad9962-485e-4fb9-ab56-d62e8d94e16e", APIVersion:"apps/v1", ResourceVersion:"1807", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-hpldj
I0114 03:51:41.387603   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment-6986c7bc94", UID:"b2ad9962-485e-4fb9-ab56-d62e8d94e16e", APIVersion:"apps/v1", ResourceVersion:"1807", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-z77cg
core.sh:1152: Successful get deployment nginx-deployment {{.spec.replicas}}: 3
service/nginx-deployment exposed
core.sh:1156: Successful get service nginx-deployment {{(index .spec.ports 0).port}}: 80
(Bdeployment.apps "nginx-deployment" deleted
service "nginx-deployment" deleted
E0114 03:51:41.832222   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:41.950827   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend created
I0114 03:51:42.016495   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"frontend", UID:"2d133b6f-3aad-43ba-9bea-91f6cabcf026", APIVersion:"v1", ResourceVersion:"1836", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-q5t5s
I0114 03:51:42.020158   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"frontend", UID:"2d133b6f-3aad-43ba-9bea-91f6cabcf026", APIVersion:"v1", ResourceVersion:"1836", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-wdh87
I0114 03:51:42.020228   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"frontend", UID:"2d133b6f-3aad-43ba-9bea-91f6cabcf026", APIVersion:"v1", ResourceVersion:"1836", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-5724m
E0114 03:51:42.065550   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1163: Successful get rc frontend {{.spec.replicas}}: 3
E0114 03:51:42.189357   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/frontend exposed
core.sh:1167: Successful get service frontend {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
service/frontend-2 exposed
core.sh:1171: Successful get service frontend-2 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 443
pod/valid-pod created
E0114 03:51:42.833463   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/frontend-3 exposed
E0114 03:51:42.952234   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1176: Successful get service frontend-3 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 444
E0114 03:51:43.067195   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/frontend-4 exposed
E0114 03:51:43.190477   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1180: Successful get service frontend-4 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: default 80
service/frontend-5 exposed
core.sh:1184: Successful get service frontend-5 {{(index .spec.ports 0).port}}: 80
(Bpod "valid-pod" deleted
service "frontend" deleted
service "frontend-2" deleted
service "frontend-3" deleted
service "frontend-4" deleted
service "frontend-5" deleted
Successful
message:error: cannot expose a Node
has:cannot expose
E0114 03:51:43.834591   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:The Service "invalid-large-service-name-that-has-more-than-sixty-three-characters" is invalid: metadata.name: Invalid value: "invalid-large-service-name-that-has-more-than-sixty-three-characters": must be no more than 63 characters
has:metadata.name: Invalid value
E0114 03:51:43.953369   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:service/kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
has:kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
E0114 03:51:44.068351   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "kubernetes-serve-hostname-testing-sixty-three-characters-in-len" deleted
E0114 03:51:44.191952   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:service/etcd-server exposed
has:etcd-server exposed
core.sh:1214: Successful get service etcd-server {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: port-1 2380
core.sh:1215: Successful get service etcd-server {{(index .spec.ports 1).name}} {{(index .spec.ports 1).port}}: port-2 2379
(Bservice "etcd-server" deleted
core.sh:1221: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
E0114 03:51:44.836008   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller "frontend" deleted
E0114 03:51:44.954706   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1225: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 03:51:45.069600   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1229: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 03:51:45.193190   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend created
I0114 03:51:45.315919   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"frontend", UID:"5a8de198-831b-481b-84c7-a513e8203454", APIVersion:"v1", ResourceVersion:"1901", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-98mhn
I0114 03:51:45.322244   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"frontend", UID:"5a8de198-831b-481b-84c7-a513e8203454", APIVersion:"v1", ResourceVersion:"1901", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-hhsfb
I0114 03:51:45.322285   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"frontend", UID:"5a8de198-831b-481b-84c7-a513e8203454", APIVersion:"v1", ResourceVersion:"1901", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-sgpvb
replicationcontroller/redis-slave created
I0114 03:51:45.557943   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"redis-slave", UID:"bf187c2f-6d91-49b2-b46a-de8e36ea36f2", APIVersion:"v1", ResourceVersion:"1910", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-vtcxr
I0114 03:51:45.563120   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"redis-slave", UID:"bf187c2f-6d91-49b2-b46a-de8e36ea36f2", APIVersion:"v1", ResourceVersion:"1910", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-bv9l4
core.sh:1234: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
core.sh:1238: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
E0114 03:51:45.837395   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller "frontend" deleted
replicationcontroller "redis-slave" deleted
E0114 03:51:45.955722   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:46.070644   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1242: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 03:51:46.194525   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1246: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/frontend created
I0114 03:51:46.398024   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"frontend", UID:"20d1bacb-ca35-4c1f-8550-6a3eeb7eb3c7", APIVersion:"v1", ResourceVersion:"1931", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-lhvft
I0114 03:51:46.401491   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"frontend", UID:"20d1bacb-ca35-4c1f-8550-6a3eeb7eb3c7", APIVersion:"v1", ResourceVersion:"1931", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-s9nqh
I0114 03:51:46.402019   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973895-5741", Name:"frontend", UID:"20d1bacb-ca35-4c1f-8550-6a3eeb7eb3c7", APIVersion:"v1", ResourceVersion:"1931", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-s84s8
core.sh:1249: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
horizontalpodautoscaler.autoscaling/frontend autoscaled
core.sh:1252: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
(Bhorizontalpodautoscaler.autoscaling "frontend" deleted
E0114 03:51:46.838878   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling/frontend autoscaled
E0114 03:51:46.956952   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1256: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
E0114 03:51:47.072021   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling "frontend" deleted
Error: required flag(s) "max" not set
E0114 03:51:47.195896   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller "frontend" deleted
core.sh:1265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
apiVersion: apps/v1
kind: Deployment
metadata:
  creationTimestamp: null
... skipping 24 lines ...
          limits:
            cpu: 300m
          requests:
            cpu: 300m
      terminationGracePeriodSeconds: 0
status: {}
Error from server (NotFound): deployments.apps "nginx-deployment-resources" not found
deployment.apps/nginx-deployment-resources created
I0114 03:51:47.796539   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment-resources", UID:"f551ea23-0f98-464f-84eb-2ba2dab586a0", APIVersion:"apps/v1", ResourceVersion:"1951", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-67f8cfff5 to 3
I0114 03:51:47.809532   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment-resources-67f8cfff5", UID:"1f2bcb28-488c-4b3f-8f21-c43c341e0881", APIVersion:"apps/v1", ResourceVersion:"1952", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-6jq7p
I0114 03:51:47.812784   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment-resources-67f8cfff5", UID:"1f2bcb28-488c-4b3f-8f21-c43c341e0881", APIVersion:"apps/v1", ResourceVersion:"1952", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-xjx9b
I0114 03:51:47.814461   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment-resources-67f8cfff5", UID:"1f2bcb28-488c-4b3f-8f21-c43c341e0881", APIVersion:"apps/v1", ResourceVersion:"1952", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-f8dx5
E0114 03:51:47.840297   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1271: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment-resources:
E0114 03:51:47.958247   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1272: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0114 03:51:48.073183   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1273: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
E0114 03:51:48.197135   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment-resources resource requirements updated
I0114 03:51:48.256466   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment-resources", UID:"f551ea23-0f98-464f-84eb-2ba2dab586a0", APIVersion:"apps/v1", ResourceVersion:"1965", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-55c547f795 to 1
I0114 03:51:48.261220   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment-resources-55c547f795", UID:"ff2fb80a-b869-4e70-9f03-5fd6aa88be85", APIVersion:"apps/v1", ResourceVersion:"1966", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-55c547f795-klq5b
core.sh:1276: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 100m:
core.sh:1277: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
error: unable to find container named redis
deployment.apps/nginx-deployment-resources resource requirements updated
I0114 03:51:48.716096   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment-resources", UID:"f551ea23-0f98-464f-84eb-2ba2dab586a0", APIVersion:"apps/v1", ResourceVersion:"1977", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-67f8cfff5 to 2
I0114 03:51:48.724263   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment-resources-67f8cfff5", UID:"1f2bcb28-488c-4b3f-8f21-c43c341e0881", APIVersion:"apps/v1", ResourceVersion:"1981", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-67f8cfff5-6jq7p
I0114 03:51:48.726688   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment-resources", UID:"f551ea23-0f98-464f-84eb-2ba2dab586a0", APIVersion:"apps/v1", ResourceVersion:"1980", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6d86564b45 to 1
I0114 03:51:48.733403   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment-resources-6d86564b45", UID:"f190b505-0367-427c-8bee-de881113dfc2", APIVersion:"apps/v1", ResourceVersion:"1985", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6d86564b45-44lfw
core.sh:1282: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
E0114 03:51:48.841685   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1283: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
E0114 03:51:48.959381   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment-resources resource requirements updated
I0114 03:51:49.070511   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment-resources", UID:"f551ea23-0f98-464f-84eb-2ba2dab586a0", APIVersion:"apps/v1", ResourceVersion:"1997", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-67f8cfff5 to 1
E0114 03:51:49.076109   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 03:51:49.092667   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment-resources-67f8cfff5", UID:"1f2bcb28-488c-4b3f-8f21-c43c341e0881", APIVersion:"apps/v1", ResourceVersion:"2001", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-67f8cfff5-xjx9b
I0114 03:51:49.092708   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment-resources", UID:"f551ea23-0f98-464f-84eb-2ba2dab586a0", APIVersion:"apps/v1", ResourceVersion:"2000", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6c478d4fdb to 1
I0114 03:51:49.099279   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973895-5741", Name:"nginx-deployment-resources-6c478d4fdb", UID:"fa11ca33-d6c6-442d-9d2b-f313d3b214e9", APIVersion:"apps/v1", ResourceVersion:"2004", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6c478d4fdb-2f4n2
E0114 03:51:49.198293   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1286: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
core.sh:1287: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
core.sh:1288: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
apiVersion: apps/v1
kind: Deployment
metadata:
... skipping 171 lines ...
    status: "True"
    type: Progressing
  observedGeneration: 4
  replicas: 4
  unavailableReplicas: 4
  updatedReplicas: 1
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
core.sh:1292: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
core.sh:1293: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
E0114 03:51:49.843014   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1294: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
E0114 03:51:49.960898   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-deployment-resources" deleted
+++ exit code: 0
E0114 03:51:50.077414   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_deployment_tests
Running command: run_deployment_tests

+++ Running case: test-cmd.run_deployment_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_deployment_tests
+++ [0114 03:51:50] Creating namespace namespace-1578973910-8468
E0114 03:51:50.199469   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578973910-8468 created
Context "test" modified.
+++ [0114 03:51:50] Testing deployments
deployment.apps/test-nginx-extensions created
I0114 03:51:50.423846   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973910-8468", Name:"test-nginx-extensions", UID:"545fb17b-ba65-4574-9f00-395d2c1cf1cf", APIVersion:"apps/v1", ResourceVersion:"2034", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-nginx-extensions-5559c76db7 to 1
I0114 03:51:50.429715   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"test-nginx-extensions-5559c76db7", UID:"b6d848f8-9432-4baf-8526-ee5a19139250", APIVersion:"apps/v1", ResourceVersion:"2035", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-nginx-extensions-5559c76db7-k5v8k
... skipping 2 lines ...
message:10
has not:2
Successful
message:apps/v1
has:apps/v1
deployment.apps "test-nginx-extensions" deleted
E0114 03:51:50.844062   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/test-nginx-apps created
I0114 03:51:50.936844   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973910-8468", Name:"test-nginx-apps", UID:"7496b238-48a2-4723-8379-8009f516484b", APIVersion:"apps/v1", ResourceVersion:"2048", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-nginx-apps-79b9bd9585 to 1
I0114 03:51:50.939278   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"test-nginx-apps-79b9bd9585", UID:"e78f5985-7c6b-416a-9086-f68734df19f4", APIVersion:"apps/v1", ResourceVersion:"2049", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-nginx-apps-79b9bd9585-2mpgt
E0114 03:51:50.962144   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:198: Successful get deploy test-nginx-apps {{(index .spec.template.spec.containers 0).name}}: nginx
E0114 03:51:51.078672   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:10
has:10
E0114 03:51:51.200716   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:apps/v1
has:apps/v1
matched Name:
matched Pod Template:
matched Labels:
... skipping 10 lines ...
                pod-template-hash=79b9bd9585
Annotations:    deployment.kubernetes.io/desired-replicas: 1
                deployment.kubernetes.io/max-replicas: 2
                deployment.kubernetes.io/revision: 1
Controlled By:  Deployment/test-nginx-apps
Replicas:       1 current / 1 desired
Pods Status:    0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=test-nginx-apps
           pod-template-hash=79b9bd9585
  Containers:
   nginx:
    Image:        k8s.gcr.io/nginx:test-cmd
... skipping 38 lines ...
Events:           <none>
(Bdeployment.apps "test-nginx-apps" deleted
apps.sh:214: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx-with-command created
I0114 03:51:51.819227   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973910-8468", Name:"nginx-with-command", UID:"426f51bf-b672-4ac5-bbc9-34b8a30e0a02", APIVersion:"apps/v1", ResourceVersion:"2062", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-with-command-757c6f58dd to 1
I0114 03:51:51.825125   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-with-command-757c6f58dd", UID:"e98819be-c27c-4db1-91dc-b0c58513e9b3", APIVersion:"apps/v1", ResourceVersion:"2063", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-with-command-757c6f58dd-cqcb9
E0114 03:51:51.845046   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:218: Successful get deploy nginx-with-command {{(index .spec.template.spec.containers 0).name}}: nginx
E0114 03:51:51.963305   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-with-command" deleted
E0114 03:51:52.079998   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:224: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 03:51:52.201835   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/deployment-with-unixuserid created
I0114 03:51:52.336192   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973910-8468", Name:"deployment-with-unixuserid", UID:"674494e9-792e-4ad5-a7fb-c492ae71c0f2", APIVersion:"apps/v1", ResourceVersion:"2076", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set deployment-with-unixuserid-8fcdfc94f to 1
I0114 03:51:52.338729   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"deployment-with-unixuserid-8fcdfc94f", UID:"59166120-f3f3-4f46-9f45-36342988fae7", APIVersion:"apps/v1", ResourceVersion:"2077", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deployment-with-unixuserid-8fcdfc94f-cl7dm
apps.sh:228: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: deployment-with-unixuserid:
(Bdeployment.apps "deployment-with-unixuserid" deleted
apps.sh:235: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 03:51:52.846285   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I0114 03:51:52.868256   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment", UID:"8df8a903-9a42-4d24-99ad-19d40b2da35b", APIVersion:"apps/v1", ResourceVersion:"2092", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I0114 03:51:52.870792   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment-6986c7bc94", UID:"a93d641e-3f19-4e5c-8e88-979a820e4a87", APIVersion:"apps/v1", ResourceVersion:"2093", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-8qr8v
I0114 03:51:52.875317   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment-6986c7bc94", UID:"a93d641e-3f19-4e5c-8e88-979a820e4a87", APIVersion:"apps/v1", ResourceVersion:"2093", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-rl9w2
I0114 03:51:52.876434   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment-6986c7bc94", UID:"a93d641e-3f19-4e5c-8e88-979a820e4a87", APIVersion:"apps/v1", ResourceVersion:"2093", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-kjcdj
E0114 03:51:52.965635   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:239: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 3
E0114 03:51:53.080927   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-deployment" deleted
E0114 03:51:53.202994   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:242: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:246: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:247: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx-deployment created
I0114 03:51:53.524760   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment", UID:"673f758c-b639-4b02-8516-a03fb412eba1", APIVersion:"apps/v1", ResourceVersion:"2114", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-7f6fc565b9 to 1
I0114 03:51:53.528215   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment-7f6fc565b9", UID:"712a3aa4-e6d1-46b7-b03a-896e80a17167", APIVersion:"apps/v1", ResourceVersion:"2115", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-7f6fc565b9-t8f9k
apps.sh:251: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 1
(Bdeployment.apps "nginx-deployment" deleted
E0114 03:51:53.847576   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:256: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:257: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 1
E0114 03:51:53.966889   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:54.082293   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps "nginx-deployment-7f6fc565b9" deleted
E0114 03:51:54.204711   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx-deployment created
I0114 03:51:54.497220   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment", UID:"22c57cb2-feca-4005-a9eb-6b8675612e89", APIVersion:"apps/v1", ResourceVersion:"2132", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I0114 03:51:54.503113   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment-6986c7bc94", UID:"473419fe-7877-4702-b980-7821ca3c951f", APIVersion:"apps/v1", ResourceVersion:"2133", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-6cdtl
I0114 03:51:54.507955   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment-6986c7bc94", UID:"473419fe-7877-4702-b980-7821ca3c951f", APIVersion:"apps/v1", ResourceVersion:"2133", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-rbm2d
I0114 03:51:54.509486   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment-6986c7bc94", UID:"473419fe-7877-4702-b980-7821ca3c951f", APIVersion:"apps/v1", ResourceVersion:"2133", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-tvmng
apps.sh:268: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
horizontalpodautoscaler.autoscaling/nginx-deployment autoscaled
E0114 03:51:54.848863   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:271: Successful get hpa nginx-deployment {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
(Bhorizontalpodautoscaler.autoscaling "nginx-deployment" deleted
E0114 03:51:54.968187   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-deployment" deleted
E0114 03:51:55.083501   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:55.205966   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:279: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx created
I0114 03:51:55.427072   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973910-8468", Name:"nginx", UID:"271c360d-2282-4058-b67a-d858db487e38", APIVersion:"apps/v1", ResourceVersion:"2156", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-f87d999f7 to 3
I0114 03:51:55.431580   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-f87d999f7", UID:"9c718491-d6ab-4686-8ca1-da4c12be42ca", APIVersion:"apps/v1", ResourceVersion:"2157", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-mw8bh
I0114 03:51:55.437939   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-f87d999f7", UID:"9c718491-d6ab-4686-8ca1-da4c12be42ca", APIVersion:"apps/v1", ResourceVersion:"2157", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-88jgq
I0114 03:51:55.440661   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-f87d999f7", UID:"9c718491-d6ab-4686-8ca1-da4c12be42ca", APIVersion:"apps/v1", ResourceVersion:"2157", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-s4pcc
apps.sh:283: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx:
apps.sh:284: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
deployment.apps/nginx skipped rollback (current template already matches revision 1)
E0114 03:51:55.850090   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:287: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0114 03:51:55.969518   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:56.085137   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
deployment.apps/nginx configured
I0114 03:51:56.122477   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973910-8468", Name:"nginx", UID:"271c360d-2282-4058-b67a-d858db487e38", APIVersion:"apps/v1", ResourceVersion:"2170", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-78487f9fd7 to 1
I0114 03:51:56.126941   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-78487f9fd7", UID:"824a0f33-b971-4a57-8a5d-1a5731aeff35", APIVersion:"apps/v1", ResourceVersion:"2171", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-78487f9fd7-zg596
E0114 03:51:56.207086   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:290: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
    Image:	k8s.gcr.io/nginx:test-cmd
apps.sh:293: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
deployment.apps/nginx rolled back
E0114 03:51:56.851317   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:56.970810   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:57.086230   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:57.208817   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:297: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
error: unable to find specified revision 1000000 in history
E0114 03:51:57.852625   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:300: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0114 03:51:57.972165   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx rolled back
E0114 03:51:58.087399   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:58.209868   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:58.853804   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:58.973412   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:59.088881   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:304: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
E0114 03:51:59.211202   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx paused
error: you cannot rollback a paused deployment; resume it first with 'kubectl rollout resume deployment/nginx' and try again
error: deployments.apps "nginx" can't restart paused deployment (run rollout resume first)
deployment.apps/nginx resumed
deployment.apps/nginx rolled back
E0114 03:51:59.855335   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:51:59.975057   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
    deployment.kubernetes.io/revision-history: 1,3
E0114 03:52:00.090224   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:00.212550   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: desired revision (3) is different from the running revision (5)
deployment.apps/nginx restarted
I0114 03:52:00.356748   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973910-8468", Name:"nginx", UID:"271c360d-2282-4058-b67a-d858db487e38", APIVersion:"apps/v1", ResourceVersion:"2203", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-f87d999f7 to 2
I0114 03:52:00.365015   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973910-8468", Name:"nginx", UID:"271c360d-2282-4058-b67a-d858db487e38", APIVersion:"apps/v1", ResourceVersion:"2206", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-78948dcb6b to 1
I0114 03:52:00.366591   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-f87d999f7", UID:"9c718491-d6ab-4686-8ca1-da4c12be42ca", APIVersion:"apps/v1", ResourceVersion:"2207", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-f87d999f7-s4pcc
I0114 03:52:00.370308   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-78948dcb6b", UID:"ef004dd6-1814-456d-a25f-e6a755583956", APIVersion:"apps/v1", ResourceVersion:"2210", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-78948dcb6b-tc6hs
E0114 03:52:00.856926   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:00.976574   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:01.091612   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:01.213893   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:apiVersion: apps/v1
kind: ReplicaSet
metadata:
  annotations:
    deployment.kubernetes.io/desired-replicas: "3"
... skipping 54 lines ...
I0114 03:52:01.616556   54689 horizontal.go:353] Horizontal Pod Autoscaler frontend has been deleted in namespace-1578973895-5741
deployment.apps/nginx2 created
I0114 03:52:01.797146   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973910-8468", Name:"nginx2", UID:"9dceef5f-12e6-489b-a122-48c44fc9b531", APIVersion:"apps/v1", ResourceVersion:"2226", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx2-57b7865cd9 to 3
I0114 03:52:01.801704   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx2-57b7865cd9", UID:"80255237-7b30-4b50-8d0e-595bb363f5af", APIVersion:"apps/v1", ResourceVersion:"2227", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-q8mcn
I0114 03:52:01.806457   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx2-57b7865cd9", UID:"80255237-7b30-4b50-8d0e-595bb363f5af", APIVersion:"apps/v1", ResourceVersion:"2227", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-mtxcv
I0114 03:52:01.806723   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx2-57b7865cd9", UID:"80255237-7b30-4b50-8d0e-595bb363f5af", APIVersion:"apps/v1", ResourceVersion:"2227", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-8f62k
E0114 03:52:01.858112   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx2" deleted
E0114 03:52:01.977923   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx" deleted
E0114 03:52:02.093244   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:334: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 03:52:02.215221   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I0114 03:52:02.366712   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment", UID:"6827d117-b62b-4e1c-bf6f-9372980e381e", APIVersion:"apps/v1", ResourceVersion:"2260", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-598d4d68b4 to 3
I0114 03:52:02.372784   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment-598d4d68b4", UID:"3d1b7d1f-f07d-4362-8d6e-b7c7f2f606f4", APIVersion:"apps/v1", ResourceVersion:"2261", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-fqbxx
I0114 03:52:02.377879   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment-598d4d68b4", UID:"3d1b7d1f-f07d-4362-8d6e-b7c7f2f606f4", APIVersion:"apps/v1", ResourceVersion:"2261", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-vznj4
I0114 03:52:02.378020   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment-598d4d68b4", UID:"3d1b7d1f-f07d-4362-8d6e-b7c7f2f606f4", APIVersion:"apps/v1", ResourceVersion:"2261", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-tzh5c
apps.sh:337: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
(Bapps.sh:338: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
(Bapps.sh:339: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
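The apps.sh:337-339 "Successful get deployment" lines above are go-template assertions against kubectl get output. A minimal sketch of the kind of query they run (the exact wrapper lives in the test-cmd helpers; these commands are illustrative):

  # Names of all deployments in the current namespace, colon-separated.
  kubectl get deployment -o go-template='{{range .items}}{{.metadata.name}}:{{end}}'
  # Image of the first container in each deployment's pod template.
  kubectl get deployment -o go-template='{{range .items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}'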
(BE0114 03:52:02.859521   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment image updated
I0114 03:52:02.863961   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment", UID:"6827d117-b62b-4e1c-bf6f-9372980e381e", APIVersion:"apps/v1", ResourceVersion:"2276", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-59df9b5f5b to 1
I0114 03:52:02.868585   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment-59df9b5f5b", UID:"9f724ded-a74e-4ee1-9fa9-ef9bce5a045c", APIVersion:"apps/v1", ResourceVersion:"2277", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-59df9b5f5b-wwhbj
E0114 03:52:02.979218   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:342: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
(BE0114 03:52:03.094628   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:343: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
(BE0114 03:52:03.216635   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: unable to find container named "redis"
deployment.apps/nginx-deployment image updated
apps.sh:348: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
(Bapps.sh:349: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
(Bdeployment.apps/nginx-deployment image updated
apps.sh:352: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
(BE0114 03:52:03.861082   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:353: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
(BE0114 03:52:03.980995   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:04.096076   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:356: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
(BE0114 03:52:04.217868   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:357: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
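The "image updated" lines and the 'unable to find container named "redis"' error above correspond to kubectl set image calls: the name on the left of '=' must match a container in the pod template, and '*' updates every container. A sketch using the image tags the assertions check for:

  # Update one named container in the deployment's pod template.
  kubectl set image deployment/nginx-deployment nginx=k8s.gcr.io/nginx:1.7.9
  # Update every container at once with the wildcard form.
  kubectl set image deployment/nginx-deployment '*=k8s.gcr.io/nginx:test-cmd'
  # Fails as above when no container in the template is named "redis".
  kubectl set image deployment/nginx-deployment redis=k8s.gcr.io/nginx:1.7.9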
(Bdeployment.apps/nginx-deployment image updated
I0114 03:52:04.445933   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment", UID:"6827d117-b62b-4e1c-bf6f-9372980e381e", APIVersion:"apps/v1", ResourceVersion:"2294", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 2
I0114 03:52:04.458560   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment-598d4d68b4", UID:"3d1b7d1f-f07d-4362-8d6e-b7c7f2f606f4", APIVersion:"apps/v1", ResourceVersion:"2298", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-fqbxx
I0114 03:52:04.460959   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment", UID:"6827d117-b62b-4e1c-bf6f-9372980e381e", APIVersion:"apps/v1", ResourceVersion:"2297", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-7d758dbc54 to 1
I0114 03:52:04.471070   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment-7d758dbc54", UID:"8d8c0987-20b2-4c1f-a032-79a0d51a55d1", APIVersion:"apps/v1", ResourceVersion:"2302", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-7d758dbc54-znmhf
apps.sh:360: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
(Bapps.sh:361: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
(BE0114 03:52:04.865005   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:364: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
(BE0114 03:52:04.982403   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:05.097395   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:365: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
(Bdeployment.apps "nginx-deployment" deleted
E0114 03:52:05.218821   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:371: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
(Bdeployment.apps/nginx-deployment created
I0114 03:52:05.577292   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment", UID:"b167382d-dbbd-40a0-a0aa-93409a165179", APIVersion:"apps/v1", ResourceVersion:"2329", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-598d4d68b4 to 3
I0114 03:52:05.581515   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment-598d4d68b4", UID:"3fd35335-4bce-4491-bfe4-32036f13cefa", APIVersion:"apps/v1", ResourceVersion:"2330", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-277lh
I0114 03:52:05.586187   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment-598d4d68b4", UID:"3fd35335-4bce-4491-bfe4-32036f13cefa", APIVersion:"apps/v1", ResourceVersion:"2330", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-djzjq
I0114 03:52:05.587050   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment-598d4d68b4", UID:"3fd35335-4bce-4491-bfe4-32036f13cefa", APIVersion:"apps/v1", ResourceVersion:"2330", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-fxsjf
configmap/test-set-env-config created
E0114 03:52:05.866538   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:05.984346   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret/test-set-env-secret created
E0114 03:52:06.098746   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:06.220306   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:376: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
(Bapps.sh:378: Successful get configmaps/test-set-env-config {{.metadata.name}}: test-set-env-config
(Bapps.sh:379: Successful get secret {{range.items}}{{.metadata.name}}:{{end}}: test-set-env-secret:
(Bdeployment.apps/nginx-deployment env updated
I0114 03:52:06.599135   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment", UID:"b167382d-dbbd-40a0-a0aa-93409a165179", APIVersion:"apps/v1", ResourceVersion:"2347", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6b9f7756b4 to 1
I0114 03:52:06.606439   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment-6b9f7756b4", UID:"8d092327-daac-47dd-8174-a8bfe1b1121f", APIVersion:"apps/v1", ResourceVersion:"2348", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6b9f7756b4-jkd49
apps.sh:383: Successful get deploy nginx-deployment {{ (index (index .spec.template.spec.containers 0).env 0).name}}: KEY_2
(Bapps.sh:385: Successful get deploy nginx-deployment {{ len (index .spec.template.spec.containers 0).env }}: 1
(BE0114 03:52:06.868092   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment env updated
I0114 03:52:06.981964   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment", UID:"b167382d-dbbd-40a0-a0aa-93409a165179", APIVersion:"apps/v1", ResourceVersion:"2357", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 2
E0114 03:52:06.985643   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 03:52:06.988905   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment-598d4d68b4", UID:"3fd35335-4bce-4491-bfe4-32036f13cefa", APIVersion:"apps/v1", ResourceVersion:"2361", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-277lh
I0114 03:52:06.995385   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment", UID:"b167382d-dbbd-40a0-a0aa-93409a165179", APIVersion:"apps/v1", ResourceVersion:"2359", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-754bf964c8 to 1
I0114 03:52:07.002698   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment-754bf964c8", UID:"0471e935-2c7b-45c3-92e5-1a5eeb90f4ef", APIVersion:"apps/v1", ResourceVersion:"2365", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-754bf964c8-9v4vc
E0114 03:52:07.100079   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:389: Successful get deploy nginx-deployment {{ len (index .spec.template.spec.containers 0).env }}: 2
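The "env updated" lines track kubectl set env mutations against the deployment, first with a literal key (apps.sh:383 sees KEY_2) and then by importing from the configmap and secret created above. A sketch of the command shapes involved (literal values here are placeholders):

  # Set a literal environment variable on every container in the pod template.
  kubectl set env deployment/nginx-deployment KEY_2=value2
  # Import all keys from a configmap or a secret as environment variables.
  kubectl set env deployment/nginx-deployment --from=configmap/test-set-env-config
  kubectl set env deployment/nginx-deployment --from=secret/test-set-env-secret
  # Remove a variable again with the trailing-dash form.
  kubectl set env deployment/nginx-deployment KEY_2-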
(BE0114 03:52:07.221380   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment env updated
I0114 03:52:07.253721   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment", UID:"b167382d-dbbd-40a0-a0aa-93409a165179", APIVersion:"apps/v1", ResourceVersion:"2377", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 1
I0114 03:52:07.259577   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment-598d4d68b4", UID:"3fd35335-4bce-4491-bfe4-32036f13cefa", APIVersion:"apps/v1", ResourceVersion:"2381", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-djzjq
I0114 03:52:07.261484   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment", UID:"b167382d-dbbd-40a0-a0aa-93409a165179", APIVersion:"apps/v1", ResourceVersion:"2380", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-c6d5c5c7b to 1
I0114 03:52:07.268226   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment-c6d5c5c7b", UID:"314de070-7212-43ec-98b9-744049c240f1", APIVersion:"apps/v1", ResourceVersion:"2385", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-c6d5c5c7b-62scl
deployment.apps/nginx-deployment env updated
... skipping 5 lines ...
I0114 03:52:07.531913   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment", UID:"b167382d-dbbd-40a0-a0aa-93409a165179", APIVersion:"apps/v1", ResourceVersion:"2417", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-6b9f7756b4 to 0
I0114 03:52:07.538960   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment", UID:"b167382d-dbbd-40a0-a0aa-93409a165179", APIVersion:"apps/v1", ResourceVersion:"2419", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-98b7fd455 to 1
deployment.apps/nginx-deployment env updated
I0114 03:52:07.657652   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment-6b9f7756b4", UID:"8d092327-daac-47dd-8174-a8bfe1b1121f", APIVersion:"apps/v1", ResourceVersion:"2420", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-6b9f7756b4-jkd49
deployment.apps/nginx-deployment env updated
deployment.apps "nginx-deployment" deleted
E0114 03:52:07.869103   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:07.987026   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap "test-set-env-config" deleted
E0114 03:52:08.101417   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret "test-set-env-secret" deleted
+++ exit code: 0
E0114 03:52:08.153191   54689 replica_set.go:534] sync "namespace-1578973910-8468/nginx-deployment-5958f7687" failed with replicasets.apps "nginx-deployment-5958f7687" not found
I0114 03:52:08.204114   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973910-8468", Name:"nginx-deployment-98b7fd455", UID:"c99c2162-806d-40ad-900f-fdde4c28ff62", APIVersion:"apps/v1", ResourceVersion:"2422", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-98b7fd455-g8jws
Recording: run_rs_tests
Running command: run_rs_tests
E0114 03:52:08.222782   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_rs_tests 
E0114 03:52:08.252087   54689 replica_set.go:534] sync "namespace-1578973910-8468/nginx-deployment-6b9f7756b4" failed with replicasets.apps "nginx-deployment-6b9f7756b4" not found
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_rs_tests
+++ [0114 03:52:08] Creating namespace namespace-1578973928-5397
namespace/namespace-1578973928-5397 created
E0114 03:52:08.401986   54689 replica_set.go:534] sync "namespace-1578973910-8468/nginx-deployment-98b7fd455" failed with replicasets.apps "nginx-deployment-98b7fd455" not found
Context "test" modified.
+++ [0114 03:52:08] Testing kubectl(v1:replicasets)
apps.sh:511: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
(Breplicaset.apps/frontend created
I0114 03:52:08.780061   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"frontend", UID:"63e26eb4-666c-4eb1-95e5-b927fcf4bb65", APIVersion:"apps/v1", ResourceVersion:"2462", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-642rj
I0114 03:52:08.783200   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"frontend", UID:"63e26eb4-666c-4eb1-95e5-b927fcf4bb65", APIVersion:"apps/v1", ResourceVersion:"2462", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-g95xc
I0114 03:52:08.784283   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"frontend", UID:"63e26eb4-666c-4eb1-95e5-b927fcf4bb65", APIVersion:"apps/v1", ResourceVersion:"2462", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-ww9kf
+++ [0114 03:52:08] Deleting rs
E0114 03:52:08.870556   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps "frontend" deleted
E0114 03:52:08.903029   54689 replica_set.go:534] sync "namespace-1578973928-5397/frontend" failed with Operation cannot be fulfilled on replicasets.apps "frontend": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1578973928-5397/frontend, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 63e26eb4-666c-4eb1-95e5-b927fcf4bb65, UID in object meta: 
E0114 03:52:08.988711   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:517: Successful get pods -l "tier=frontend" {{range.items}}{{.metadata.name}}:{{end}}: 
(BE0114 03:52:09.102733   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:521: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
(BE0114 03:52:09.224038   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I0114 03:52:09.320934   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"frontend", UID:"e5509eeb-bf54-48fa-8ac2-a5b8dedb14ad", APIVersion:"apps/v1", ResourceVersion:"2477", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-xqvvj
I0114 03:52:09.324237   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"frontend", UID:"e5509eeb-bf54-48fa-8ac2-a5b8dedb14ad", APIVersion:"apps/v1", ResourceVersion:"2477", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-j2gkw
I0114 03:52:09.324272   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"frontend", UID:"e5509eeb-bf54-48fa-8ac2-a5b8dedb14ad", APIVersion:"apps/v1", ResourceVersion:"2477", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-t86zq
apps.sh:525: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
(B+++ [0114 03:52:09] Deleting rs
replicaset.apps "frontend" deleted
E0114 03:52:09.601791   54689 replica_set.go:534] sync "namespace-1578973928-5397/frontend" failed with replicasets.apps "frontend" not found
apps.sh:529: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
(BI0114 03:52:09.732379   54689 horizontal.go:353] Horizontal Pod Autoscaler nginx-deployment has been deleted in namespace-1578973910-8468
apps.sh:531: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
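apps.sh:529 and apps.sh:531 together show the replicaset gone while its php-redis pods survive, which is the behaviour of a delete with cascading disabled; the pods then have to be removed separately, as the "pod ... deleted" lines below show. An illustrative sketch, assuming the flag spelling of this kubectl vintage (--cascade=false; newer releases spell it --cascade=orphan):

  # Delete the replicaset but orphan its pods.
  kubectl delete rs frontend --cascade=false
  # The orphaned pods remain and must be deleted by label.
  kubectl delete pods -l tier=frontend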
(BE0114 03:52:09.871934   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod "frontend-j2gkw" deleted
pod "frontend-t86zq" deleted
pod "frontend-xqvvj" deleted
E0114 03:52:09.990008   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:534: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
(BE0114 03:52:10.104087   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:538: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
(BE0114 03:52:10.225247   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I0114 03:52:10.333041   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"frontend", UID:"3e76c8e0-8a8b-431c-ad5f-900a883c08fe", APIVersion:"apps/v1", ResourceVersion:"2500", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-rv9xj
I0114 03:52:10.336811   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"frontend", UID:"3e76c8e0-8a8b-431c-ad5f-900a883c08fe", APIVersion:"apps/v1", ResourceVersion:"2500", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-b69pz
I0114 03:52:10.336863   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"frontend", UID:"3e76c8e0-8a8b-431c-ad5f-900a883c08fe", APIVersion:"apps/v1", ResourceVersion:"2500", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-6w5p6
apps.sh:542: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
(Bmatched Name:
... skipping 8 lines ...
Namespace:    namespace-1578973928-5397
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1578973928-5397
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 18 lines ...
Namespace:    namespace-1578973928-5397
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 4 lines ...
      memory:  100Mi
    Environment:
      GET_HOSTS_FROM:  dns
    Mounts:            <none>
  Volumes:             <none>
(B

E0114 03:52:10.873056   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:550: Successful describe
Name:         frontend
Namespace:    namespace-1578973928-5397
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 10 lines ...
  Type    Reason            Age   From                   Message
  ----    ------            ----  ----                   -------
  Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-rv9xj
  Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-b69pz
  Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-6w5p6
(B

E0114 03:52:10.991141   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:11.105169   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Pod Template:
matched Labels:
matched Selector:
matched Replicas:
matched Pods Status:
... skipping 3 lines ...
Namespace:    namespace-1578973928-5397
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                   Message
  ----    ------            ----  ----                   -------
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-rv9xj
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-b69pz
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-6w5p6
(BE0114 03:52:11.226388   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1578973928-5397
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1578973928-5397
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 11 lines ...
Namespace:    namespace-1578973928-5397
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 104 lines ...
Node-Selectors:        <none>
Tolerations:           <none>
Events:                <none>
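The Name/Namespace/Selector/Pod Template blocks above are kubectl describe output for the frontend replicaset, and the Node-Selectors/Tolerations tail is describe output for its pods, matched line by line by the harness. Roughly (the label selector here is illustrative):

  # Human-readable summary of the replicaset, including its events.
  kubectl describe rs frontend
  # The same kind of summary for the pods it created.
  kubectl describe pods -l tier=frontend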
(Bapps.sh:564: Successful get rs frontend {{.spec.replicas}}: 3
(Breplicaset.apps/frontend scaled
E0114 03:52:11.870271   54689 replica_set.go:199] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1578973928-5397 /apis/apps/v1/namespaces/namespace-1578973928-5397/replicasets/frontend 3e76c8e0-8a8b-431c-ad5f-900a883c08fe 2511 2 2020-01-14 03:52:10 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  [{kubectl Update apps/v1 2020-01-14 03:52:10 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 109 101 116 97 100 97 116 97 34 58 123 34 102 58 108 97 98 101 108 115 34 58 123 34 102 58 97 112 112 34 58 123 125 44 34 102 58 116 105 101 114 34 58 123 125 44 34 46 34 58 123 125 125 125 44 34 102 58 115 112 101 99 34 58 123 34 102 58 114 101 112 108 105 99 97 115 34 58 123 125 44 34 102 58 115 101 108 101 99 116 111 114 34 58 123 34 102 58 109 97 116 99 104 76 97 98 101 108 115 34 58 123 34 102 58 97 112 112 34 58 123 125 44 34 102 58 116 105 101 114 34 58 123 125 44 34 46 34 58 123 125 125 125 44 34 102 58 116 101 109 112 108 97 116 101 34 58 123 34 102 58 109 101 116 97 100 97 116 97 34 58 123 34 102 58 108 97 98 101 108 115 34 58 123 34 102 58 97 112 112 34 58 123 125 44 34 102 58 116 105 101 114 34 58 123 125 44 34 46 34 58 123 125 125 125 44 34 102 58 115 112 101 99 34 58 123 34 102 58 99 111 110 116 97 105 110 101 114 115 34 58 123 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 112 104 112 45 114 101 100 105 115 92 34 125 34 58 123 34 102 58 101 110 118 34 58 123 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 71 69 84 95 72 79 83 84 83 95 70 82 79 77 92 34 125 34 58 123 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 112 111 114 116 115 34 58 123 34 107 58 123 92 34 99 111 110 116 97 105 110 101 114 80 111 114 116 92 34 58 56 48 44 92 34 112 114 111 116 111 99 111 108 92 34 58 92 34 84 67 80 92 34 125 34 58 123 34 102 58 99 111 110 116 97 105 110 101 114 80 111 114 116 34 58 123 125 44 34 102 58 112 114 111 116 111 99 111 108 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 34 102 58 114 101 113 117 101 115 116 115 34 58 123 34 102 58 99 112 117 34 58 123 125 44 34 102 58 109 101 109 111 114 121 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 46 34 58 123 125 125 125 44 34 102 58 100 110 115 80 111 108 105 99 121 34 58 123 125 44 34 102 58 114 101 115 116 97 114 116 80 111 108 105 99 121 34 58 123 125 44 34 102 58 115 99 104 101 100 117 108 101 114 78 97 109 101 34 58 123 125 44 34 102 58 115 101 99 117 114 105 116 121 67 111 110 116 101 120 116 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 71 114 97 99 101 80 101 114 105 111 100 83 101 99 111 110 100 115 34 58 123 125 125 125 125 125],}} {kube-controller-manager Update apps/v1 2020-01-14 03:52:10 +0000 UTC FieldsV1 &FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 102 117 108 108 121 76 97 98 101 108 101 100 82 101 112 108 105 99 97 115 34 58 123 125 44 34 102 58 111 98 115 101 114 118 101 100 71 101 110 101 114 97 116 105 111 110 34 58 123 125 44 34 102 
58 114 101 112 108 105 99 97 115 34 58 123 125 125 125],}}]},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v3 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc0031a71f8 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] []}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:1,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
E0114 03:52:11.874682   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 03:52:11.877197   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"frontend", UID:"3e76c8e0-8a8b-431c-ad5f-900a883c08fe", APIVersion:"apps/v1", ResourceVersion:"2511", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-rv9xj
apps.sh:568: Successful get rs frontend {{.spec.replicas}}: 2
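apps.sh:564 and apps.sh:568 bracket a scale-down of the frontend replicaset from 3 to 2 replicas, visible in the controller's SuccessfulDelete event for frontend-rv9xj. Roughly the command shape being exercised:

  # Scale the replicaset down to two replicas, then read the spec back.
  kubectl scale rs frontend --replicas=2
  kubectl get rs frontend -o go-template='{{.spec.replicas}}'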
(BE0114 03:52:11.992505   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:12.107111   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/scale-1 created
I0114 03:52:12.190976   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973928-5397", Name:"scale-1", UID:"cb1b89b2-fd8f-418f-a28e-72266e3ff5e5", APIVersion:"apps/v1", ResourceVersion:"2517", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 1
I0114 03:52:12.197489   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"scale-1-5c5565bcd9", UID:"ebd70c47-374e-467d-88ef-05ca72abaa3b", APIVersion:"apps/v1", ResourceVersion:"2518", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-5c5565bcd9-kdvmb
E0114 03:52:12.227866   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/scale-2 created
I0114 03:52:12.428736   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973928-5397", Name:"scale-2", UID:"beb7ef88-5454-49ca-81ab-7424c17c6b31", APIVersion:"apps/v1", ResourceVersion:"2527", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-5c5565bcd9 to 1
I0114 03:52:12.441698   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"scale-2-5c5565bcd9", UID:"a7e5f009-d542-4857-bfc8-0402a9cd9214", APIVersion:"apps/v1", ResourceVersion:"2528", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-5c5565bcd9-vhxpf
deployment.apps/scale-3 created
I0114 03:52:12.648366   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973928-5397", Name:"scale-3", UID:"7ce4fe9c-7ac8-4716-902c-e7988177cf6c", APIVersion:"apps/v1", ResourceVersion:"2539", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-3-5c5565bcd9 to 1
I0114 03:52:12.652453   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"scale-3-5c5565bcd9", UID:"4d92bf91-d78c-4683-afaa-1c71381bdf3a", APIVersion:"apps/v1", ResourceVersion:"2540", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-cts5s
apps.sh:574: Successful get deploy scale-1 {{.spec.replicas}}: 1
(BE0114 03:52:12.876439   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:575: Successful get deploy scale-2 {{.spec.replicas}}: 1
(BE0114 03:52:12.993639   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:576: Successful get deploy scale-3 {{.spec.replicas}}: 1
(Bdeployment.apps/scale-1 scaled
I0114 03:52:13.105994   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973928-5397", Name:"scale-1", UID:"cb1b89b2-fd8f-418f-a28e-72266e3ff5e5", APIVersion:"apps/v1", ResourceVersion:"2549", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 2
deployment.apps/scale-2 scaled
I0114 03:52:13.109086   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"scale-1-5c5565bcd9", UID:"ebd70c47-374e-467d-88ef-05ca72abaa3b", APIVersion:"apps/v1", ResourceVersion:"2550", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-5c5565bcd9-8gzkf
E0114 03:52:13.109542   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 03:52:13.114012   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973928-5397", Name:"scale-2", UID:"beb7ef88-5454-49ca-81ab-7424c17c6b31", APIVersion:"apps/v1", ResourceVersion:"2551", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-5c5565bcd9 to 2
I0114 03:52:13.119854   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"scale-2-5c5565bcd9", UID:"a7e5f009-d542-4857-bfc8-0402a9cd9214", APIVersion:"apps/v1", ResourceVersion:"2557", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-5c5565bcd9-vch2j
apps.sh:579: Successful get deploy scale-1 {{.spec.replicas}}: 2
(BE0114 03:52:13.229218   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:580: Successful get deploy scale-2 {{.spec.replicas}}: 2
(Bapps.sh:581: Successful get deploy scale-3 {{.spec.replicas}}: 1
(Bdeployment.apps/scale-1 scaled
deployment.apps/scale-2 scaled
deployment.apps/scale-3 scaled
I0114 03:52:13.572273   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973928-5397", Name:"scale-1", UID:"cb1b89b2-fd8f-418f-a28e-72266e3ff5e5", APIVersion:"apps/v1", ResourceVersion:"2569", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 3
... skipping 2 lines ...
I0114 03:52:13.578124   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"scale-1-5c5565bcd9", UID:"ebd70c47-374e-467d-88ef-05ca72abaa3b", APIVersion:"apps/v1", ResourceVersion:"2572", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-5c5565bcd9-66x7s
I0114 03:52:13.581986   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"scale-3-5c5565bcd9", UID:"4d92bf91-d78c-4683-afaa-1c71381bdf3a", APIVersion:"apps/v1", ResourceVersion:"2574", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-vcwsq
I0114 03:52:13.583417   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"scale-2-5c5565bcd9", UID:"a7e5f009-d542-4857-bfc8-0402a9cd9214", APIVersion:"apps/v1", ResourceVersion:"2573", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-5c5565bcd9-tnqmx
I0114 03:52:13.586305   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"scale-3-5c5565bcd9", UID:"4d92bf91-d78c-4683-afaa-1c71381bdf3a", APIVersion:"apps/v1", ResourceVersion:"2574", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-vv46g
apps.sh:584: Successful get deploy scale-1 {{.spec.replicas}}: 3
(Bapps.sh:585: Successful get deploy scale-2 {{.spec.replicas}}: 3
(BE0114 03:52:13.877580   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:586: Successful get deploy scale-3 {{.spec.replicas}}: 3
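scale-1 and scale-2 are first scaled to 2 while scale-3 stays at 1 (apps.sh:579-581), then all three end up at 3 replicas (apps.sh:584-586); kubectl scale accepts several resource names in one invocation, which is a sketch of what that looks like:

  # Scale two deployments in one call ...
  kubectl scale deployment scale-1 scale-2 --replicas=2
  # ... then bring all three to the same size.
  kubectl scale deployment scale-1 scale-2 scale-3 --replicas=3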
(BE0114 03:52:13.994816   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps "frontend" deleted
deployment.apps "scale-1" deleted
E0114 03:52:14.113277   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "scale-2" deleted
deployment.apps "scale-3" deleted
E0114 03:52:14.230351   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I0114 03:52:14.376261   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"frontend", UID:"31aace2d-f0c3-4bc2-b225-99e585557001", APIVersion:"apps/v1", ResourceVersion:"2630", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-xtn9r
I0114 03:52:14.383500   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"frontend", UID:"31aace2d-f0c3-4bc2-b225-99e585557001", APIVersion:"apps/v1", ResourceVersion:"2630", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-jz858
I0114 03:52:14.383545   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"frontend", UID:"31aace2d-f0c3-4bc2-b225-99e585557001", APIVersion:"apps/v1", ResourceVersion:"2630", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-wvf6f
apps.sh:594: Successful get rs frontend {{.spec.replicas}}: 3
(Bservice/frontend exposed
apps.sh:598: Successful get service frontend {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
(BE0114 03:52:14.879329   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/frontend-2 exposed
E0114 03:52:14.997516   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:15.115185   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:602: Successful get service frontend-2 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: default 80
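The "service/frontend exposed" and "service/frontend-2 exposed" lines come from kubectl expose against the replicaset, once under the default name and once as frontend-2; apps.sh:598 and apps.sh:602 then check the generated port name and number. Illustrative forms (flags beyond the names shown in the log are assumptions):

  # Expose the replicaset as a service on port 80.
  kubectl expose rs frontend --port=80
  # Expose it a second time under a different service name.
  kubectl expose rs frontend --name=frontend-2 --port=80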
(BE0114 03:52:15.231850   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "frontend" deleted
service "frontend-2" deleted
apps.sh:608: Successful get rs frontend {{.metadata.generation}}: 1
(Breplicaset.apps/frontend image updated
apps.sh:610: Successful get rs frontend {{.metadata.generation}}: 2
(Breplicaset.apps/frontend env updated
E0114 03:52:15.880616   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:612: Successful get rs frontend {{.metadata.generation}}: 3
(BE0114 03:52:15.998722   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend resource requirements updated
E0114 03:52:16.116656   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:16.233363   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:614: Successful get rs frontend {{.metadata.generation}}: 4
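apps.sh:608 through apps.sh:614 watch .metadata.generation climb from 1 to 4: each spec mutation (set image, set env, set resources) bumps it by one. A sketch of the mutations and the check, with image, key and resource values as placeholders:

  # Each of these edits the pod template and increments metadata.generation.
  kubectl set image rs/frontend php-redis=gcr.io/google_samples/gb-frontend:v4
  kubectl set env rs/frontend TIER=frontend
  kubectl set resources rs/frontend --limits=cpu=200m,memory=512Mi
  # Read back the generation counter the assertions compare against.
  kubectl get rs frontend -o go-template='{{.metadata.generation}}'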
(Bapps.sh:618: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
(Breplicaset.apps "frontend" deleted
apps.sh:622: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
(Bapps.sh:626: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
(BE0114 03:52:16.881854   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:17.000453   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
E0114 03:52:17.117624   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 03:52:17.120791   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"frontend", UID:"94603ef3-4875-48c7-91bb-40fddc2c4832", APIVersion:"apps/v1", ResourceVersion:"2668", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-vmkr7
I0114 03:52:17.126845   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"frontend", UID:"94603ef3-4875-48c7-91bb-40fddc2c4832", APIVersion:"apps/v1", ResourceVersion:"2668", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-99bzf
I0114 03:52:17.127615   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"frontend", UID:"94603ef3-4875-48c7-91bb-40fddc2c4832", APIVersion:"apps/v1", ResourceVersion:"2668", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-5rrpw
E0114 03:52:17.234561   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/redis-slave created
I0114 03:52:17.405783   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"redis-slave", UID:"79359861-6177-402a-bb12-7ac3b0b00fcd", APIVersion:"apps/v1", ResourceVersion:"2677", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-tgfd9
I0114 03:52:17.413344   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"redis-slave", UID:"79359861-6177-402a-bb12-7ac3b0b00fcd", APIVersion:"apps/v1", ResourceVersion:"2677", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-mktjq
apps.sh:631: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
(Bapps.sh:635: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
(Breplicaset.apps "frontend" deleted
replicaset.apps "redis-slave" deleted
E0114 03:52:17.883022   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:639: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
(BE0114 03:52:18.001939   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:18.118775   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:644: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
(BE0114 03:52:18.236266   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I0114 03:52:18.396710   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"frontend", UID:"ad04c25e-edbe-4920-aba5-cfe22312c7cf", APIVersion:"apps/v1", ResourceVersion:"2696", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-ltf77
I0114 03:52:18.399357   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"frontend", UID:"ad04c25e-edbe-4920-aba5-cfe22312c7cf", APIVersion:"apps/v1", ResourceVersion:"2696", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-b56c8
I0114 03:52:18.400458   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973928-5397", Name:"frontend", UID:"ad04c25e-edbe-4920-aba5-cfe22312c7cf", APIVersion:"apps/v1", ResourceVersion:"2696", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-4xz2c
apps.sh:647: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
(Bhorizontalpodautoscaler.autoscaling/frontend autoscaled
apps.sh:650: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
(BE0114 03:52:18.884337   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling "frontend" deleted
horizontalpodautoscaler.autoscaling/frontend autoscaled
E0114 03:52:19.003130   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:654: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
(BE0114 03:52:19.120334   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling "frontend" deleted
E0114 03:52:19.237516   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Error: required flag(s) "max" not set
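The two "frontend autoscaled" lines and the hpa checks at apps.sh:650 and apps.sh:654 come from kubectl autoscale against the replicaset, and the final 'required flag(s) "max" not set' error shows that --max is mandatory while minReplicas falls back to 1 when --min is omitted. Roughly:

  # Creates an HPA with min defaulted to 1, max 2, target 70% CPU (apps.sh:650 checks "1 2 70").
  kubectl autoscale rs frontend --max=2 --cpu-percent=70
  # Explicit min and max (apps.sh:654 checks "2 3 80").
  kubectl autoscale rs frontend --min=2 --max=3 --cpu-percent=80
  # Omitting --max is rejected with the error shown above.
  kubectl autoscale rs frontend --cpu-percent=70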
replicaset.apps "frontend" deleted
+++ exit code: 0
Recording: run_stateful_set_tests
Running command: run_stateful_set_tests

+++ Running case: test-cmd.run_stateful_set_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_stateful_set_tests
+++ [0114 03:52:19] Creating namespace namespace-1578973939-4145
namespace/namespace-1578973939-4145 created
Context "test" modified.
+++ [0114 03:52:19] Testing kubectl(v1:statefulsets)
apps.sh:470: Successful get statefulset {{range.items}}{{.metadata.name}}:{{end}}: 
(BE0114 03:52:19.885632   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:20.004483   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 03:52:20.049143   51249 controller.go:606] quota admission added evaluator for: statefulsets.apps
statefulset.apps/nginx created
E0114 03:52:20.121558   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:476: Successful get statefulset nginx {{.spec.replicas}}: 0
(BE0114 03:52:20.238631   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:477: Successful get statefulset nginx {{.status.observedGeneration}}: 1
(Bstatefulset.apps/nginx scaled
I0114 03:52:20.449755   54689 event.go:278] Event(v1.ObjectReference{Kind:"StatefulSet", Namespace:"namespace-1578973939-4145", Name:"nginx", UID:"a098cc1c-5274-4075-ae3c-798ff284e4b7", APIVersion:"apps/v1", ResourceVersion:"2724", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' create Pod nginx-0 in StatefulSet nginx successful
apps.sh:481: Successful get statefulset nginx {{.spec.replicas}}: 1
(Bapps.sh:482: Successful get statefulset nginx {{.status.observedGeneration}}: 2
(BE0114 03:52:20.886900   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx restarted
E0114 03:52:21.005694   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:490: Successful get statefulset nginx {{.status.observedGeneration}}: 3
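apps.sh:476 through apps.sh:490 walk the nginx statefulset from 0 replicas through a scale-up and a rollout restart, watching .status.observedGeneration advance 1 -> 2 -> 3. A sketch of the command shapes involved:

  # Scale up; the controller creates nginx-0 and observedGeneration moves to 2.
  kubectl scale statefulset nginx --replicas=1
  # Trigger a restart by bumping the pod template; observedGeneration moves to 3.
  kubectl rollout restart statefulset nginx
  # Read the status field the assertions compare against.
  kubectl get statefulset nginx -o go-template='{{.status.observedGeneration}}'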
(BE0114 03:52:21.122806   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps "nginx" deleted
I0114 03:52:21.136167   54689 stateful_set.go:420] StatefulSet has been deleted namespace-1578973939-4145/nginx
E0114 03:52:21.239972   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
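The StatefulSet sequence above (create at zero replicas, scale to one, restart, delete) corresponds roughly to the commands below. This is a sketch: the manifest path is borrowed from the rollingupdate-statefulset.yaml test data referenced later in this log rather than read out of run_stateful_set_tests itself.

kubectl create -f hack/testdata/rollingupdate-statefulset.yaml   # nginx StatefulSet with spec.replicas: 0
kubectl get statefulset nginx -o go-template='{{.status.observedGeneration}}'
kubectl scale statefulset nginx --replicas=1                     # observedGeneration moves to 2
kubectl rollout restart statefulset nginx                        # "statefulset.apps/nginx restarted", generation 3
kubectl delete statefulset nginx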
Recording: run_statefulset_history_tests
Running command: run_statefulset_history_tests

+++ Running case: test-cmd.run_statefulset_history_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_statefulset_history_tests
+++ [0114 03:52:21] Creating namespace namespace-1578973941-15670
namespace/namespace-1578973941-15670 created
Context "test" modified.
+++ [0114 03:52:21] Testing kubectl(v1:statefulsets, v1:controllerrevisions)
apps.sh:418: Successful get statefulset {{range.items}}{{.metadata.name}}:{{end}}: 
(BE0114 03:52:21.888337   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx created
E0114 03:52:22.006735   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:422: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1578973941-15670"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.7","name":"nginx","ports":[{"containerPort":80,"name":"web"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
(BE0114 03:52:22.124040   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx skipped rollback (current template already matches revision 1)
E0114 03:52:22.241192   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:425: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
(Bapps.sh:426: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
(Bstatefulset.apps/nginx configured
apps.sh:429: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
(Bapps.sh:430: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
(BE0114 03:52:22.889633   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:431: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
(BE0114 03:52:23.008244   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:432: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1578973941-15670"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.7","name":"nginx","ports":[{"containerPort":80,"name":"web"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1578973941-15670"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.8","name":"nginx","ports":[{"containerPort":80,"name":"web"}]},{"image":"k8s.gcr.io/pause:2.0","name":"pause","ports":[{"containerPort":81,"name":"web-2"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
(BE0114 03:52:23.125177   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx will roll back to Pod Template:
  Labels:	app=nginx-statefulset
  Containers:
   nginx:
    Image:	k8s.gcr.io/nginx-slim:0.7
    Port:	80/TCP
... skipping 3 lines ...
      -c
      while true; do sleep 1; done
    Environment:	<none>
    Mounts:	<none>
  Volumes:	<none>
 (dry run)
E0114 03:52:23.242368   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:435: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
(Bapps.sh:436: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
(Bapps.sh:437: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
(Bstatefulset.apps/nginx rolled back
apps.sh:440: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
(BE0114 03:52:23.890893   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:441: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
(BE0114 03:52:24.011487   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
E0114 03:52:24.126414   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:445: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
(BE0114 03:52:24.243596   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:446: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
(Bstatefulset.apps/nginx rolled back
apps.sh:449: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
(Bapps.sh:450: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
(Bapps.sh:451: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
(BE0114 03:52:24.892368   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps "nginx" deleted
I0114 03:52:24.988883   54689 stateful_set.go:420] StatefulSet has been deleted namespace-1578973941-15670/nginx
E0114 03:52:25.012891   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:25.127830   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
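The rollout-history assertions above record two applied revisions (nginx-slim:0.7, then nginx-slim:0.8 plus a pause container) and roll between them. A sketch of the equivalent commands, assuming the same manifests shown in the change-cause annotations:

kubectl apply -f hack/testdata/rollingupdate-statefulset.yaml --record       # revision 1
kubectl apply -f hack/testdata/rollingupdate-statefulset-rv2.yaml --record   # revision 2, adds the pause container
kubectl rollout history statefulset/nginx
kubectl rollout undo statefulset/nginx --to-revision=1                       # back to nginx-slim:0.7
kubectl rollout undo statefulset/nginx --to-revision=1000000                 # fails: unable to find specified revision
kubectl rollout undo statefulset/nginx                                       # forward again to the two-container template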
Recording: run_lists_tests
Running command: run_lists_tests

+++ Running case: test-cmd.run_lists_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_lists_tests
E0114 03:52:25.245092   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [0114 03:52:25] Creating namespace namespace-1578973945-16611
namespace/namespace-1578973945-16611 created
Context "test" modified.
+++ [0114 03:52:25] Testing kubectl(v1:lists)
service/list-service-test created
deployment.apps/list-deployment-test created
I0114 03:52:25.673001   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973945-16611", Name:"list-deployment-test", UID:"4799e15c-a8d7-40b3-a814-28f158ae3ef6", APIVersion:"apps/v1", ResourceVersion:"2765", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set list-deployment-test-7cd8c5ff6d to 1
I0114 03:52:25.679401   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973945-16611", Name:"list-deployment-test-7cd8c5ff6d", UID:"4fd9ea3b-4c9c-4c4f-b671-6c96ba0422f9", APIVersion:"apps/v1", ResourceVersion:"2766", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: list-deployment-test-7cd8c5ff6d-5fh4j
service "list-service-test" deleted
deployment.apps "list-deployment-test" deleted
+++ exit code: 0
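run_lists_tests drives a single file containing a v1 List that wraps a Service and a Deployment. A minimal sketch of such a manifest, fed through a here-document; the object names mirror the output above, while the port, selector, and image are placeholders:

kubectl create -f - <<EOF
apiVersion: v1
kind: List
items:
- apiVersion: v1
  kind: Service
  metadata:
    name: list-service-test
  spec:
    ports:
    - port: 80
- apiVersion: apps/v1
  kind: Deployment
  metadata:
    name: list-deployment-test
  spec:
    replicas: 1
    selector:
      matchLabels:
        app: list-deployment-test
    template:
      metadata:
        labels:
          app: list-deployment-test
      spec:
        containers:
        - name: test
          image: k8s.gcr.io/pause:2.0
EOF
kubectl delete service list-service-test
kubectl delete deployment list-deployment-test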
Recording: run_multi_resources_tests
Running command: run_multi_resources_tests
E0114 03:52:25.893608   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_multi_resources_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_multi_resources_tests
+++ [0114 03:52:25] Creating namespace namespace-1578973945-6740
E0114 03:52:26.014173   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578973945-6740 created
Context "test" modified.
+++ [0114 03:52:26] Testing kubectl(v1:multiple resources)
Testing with file hack/testdata/multi-resource-yaml.yaml and replace with file hack/testdata/multi-resource-yaml-modify.yaml
E0114 03:52:26.129040   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:26.246205   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
(Bgeneric-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
(Bservice/mock created
replicationcontroller/mock created
I0114 03:52:26.600882   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973945-6740", Name:"mock", UID:"5b530942-fb01-495d-a37b-1cd527f04db1", APIVersion:"v1", ResourceVersion:"2787", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-dv498
generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
(Bgeneric-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
(BE0114 03:52:26.894768   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
service/mock   ClusterIP   10.0.0.202   <none>        99/TCP    0s

NAME                         DESIRED   CURRENT   READY   AGE
replicationcontroller/mock   1         1         0       0s
E0114 03:52:27.015379   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:27.130046   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Name:              mock
Namespace:         namespace-1578973945-6740
Labels:            app=mock
Annotations:       <none>
Selector:          app=mock
Type:              ClusterIP
... skipping 8 lines ...
Name:         mock
Namespace:    namespace-1578973945-6740
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 2 lines ...
    Mounts:       <none>
  Volumes:        <none>
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: mock-dv498
E0114 03:52:27.247308   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
service/mock replaced
replicationcontroller/mock replaced
I0114 03:52:27.386675   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973945-6740", Name:"mock", UID:"5b5310b1-c861-4acf-8486-38675f68078b", APIVersion:"v1", ResourceVersion:"2803", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-ncsxv
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
(Bgeneric-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
(Bservice/mock edited
replicationcontroller/mock edited
E0114 03:52:27.896067   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
(BE0114 03:52:28.016550   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
(BE0114 03:52:28.131288   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock labeled
replicationcontroller/mock labeled
E0114 03:52:28.248712   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
(Bgeneric-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
(Bservice/mock annotated
replicationcontroller/mock annotated
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
(Bgeneric-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
(Bservice "mock" deleted
replicationcontroller "mock" deleted
Testing with file hack/testdata/multi-resource-list.json and replace with file hack/testdata/multi-resource-list-modify.json
E0114 03:52:28.897158   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
(BE0114 03:52:29.017686   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
(BE0114 03:52:29.132810   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:29.250334   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
replicationcontroller/mock created
I0114 03:52:29.326259   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973945-6740", Name:"mock", UID:"d048cb6e-d3aa-4405-8c80-dc86faf9357e", APIVersion:"v1", ResourceVersion:"2827", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-6f92s
generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
(Bgeneric-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
(BNAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
... skipping 18 lines ...
Name:         mock
Namespace:    namespace-1578973945-6740
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 2 lines ...
    Mounts:       <none>
  Volumes:        <none>
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: mock-6f92s
E0114 03:52:29.898385   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:30.018875   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
service/mock replaced
replicationcontroller/mock replaced
I0114 03:52:30.105988   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973945-6740", Name:"mock", UID:"52530fe8-90dc-4227-9bbc-20af05d35513", APIVersion:"v1", ResourceVersion:"2843", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-ngmjf
E0114 03:52:30.133971   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
(BE0114 03:52:30.251841   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
(Bservice/mock edited
replicationcontroller/mock edited
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
(Bgeneric-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
(BE0114 03:52:30.899447   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock labeled
replicationcontroller/mock labeled
E0114 03:52:31.020160   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
(BE0114 03:52:31.135143   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
(Bservice/mock annotated
replicationcontroller/mock annotated
E0114 03:52:31.252968   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
(Bgeneric-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
(Bservice "mock" deleted
replicationcontroller "mock" deleted
Testing with file hack/testdata/multi-resource-json.json and replace with file hack/testdata/multi-resource-json-modify.json
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
(Bgeneric-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
(BE0114 03:52:31.900728   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:32.021410   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
replicationcontroller/mock created
I0114 03:52:32.074038   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973945-6740", Name:"mock", UID:"bda5a37b-1377-49d4-a325-8c04e1c4e530", APIVersion:"v1", ResourceVersion:"2867", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-8stmp
E0114 03:52:32.136420   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
(BE0114 03:52:32.254254   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
(BNAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
service/mock   ClusterIP   10.0.0.140   <none>        99/TCP    0s

NAME                         DESIRED   CURRENT   READY   AGE
replicationcontroller/mock   1         1         0       0s
... skipping 14 lines ...
Name:         mock
Namespace:    namespace-1578973945-6740
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 7 lines ...
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: mock-8stmp
service "mock" deleted
replicationcontroller "mock" deleted
service/mock replaced
replicationcontroller/mock replaced
I0114 03:52:32.838025   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973945-6740", Name:"mock", UID:"764f3f2e-8f7f-45b4-a1bb-de73b6fefd74", APIVersion:"v1", ResourceVersion:"2883", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-lbmvj
E0114 03:52:32.902006   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
(BE0114 03:52:33.022620   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
(BE0114 03:52:33.137618   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:33.255459   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock edited
replicationcontroller/mock edited
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
(Bgeneric-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
(BI0114 03:52:33.642163   54689 horizontal.go:353] Horizontal Pod Autoscaler frontend has been deleted in namespace-1578973928-5397
service/mock labeled
replicationcontroller/mock labeled
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
(Bgeneric-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
(BE0114 03:52:33.903103   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock annotated
replicationcontroller/mock annotated
E0114 03:52:34.024052   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
(BE0114 03:52:34.138875   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
(BE0114 03:52:34.258608   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
Testing with file hack/testdata/multi-resource-rclist.json and replace with file hack/testdata/multi-resource-rclist-modify.json
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
(Bgeneric-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
(Breplicationcontroller/mock created
replicationcontroller/mock2 created
I0114 03:52:34.825930   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973945-6740", Name:"mock", UID:"bcd55b3e-2bba-4572-827d-95db39de654a", APIVersion:"v1", ResourceVersion:"2904", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-rn7ds
I0114 03:52:34.828432   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973945-6740", Name:"mock2", UID:"27b366cc-836a-46d1-b441-d09f6e5bc8c9", APIVersion:"v1", ResourceVersion:"2905", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock2-tfs9n
E0114 03:52:34.904464   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:78: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:mock2:
(BE0114 03:52:35.025173   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAME    DESIRED   CURRENT   READY   AGE
mock    1         1         0       1s
mock2   1         1         0       1s
E0114 03:52:35.140236   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:35.260507   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Name:         mock
Namespace:    namespace-1578973945-6740
Selector:     app=mock
Labels:       app=mock
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 11 lines ...
Namespace:    namespace-1578973945-6740
Selector:     app=mock2
Labels:       app=mock2
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock2
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 10 lines ...
replicationcontroller/mock replaced
I0114 03:52:35.554654   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973945-6740", Name:"mock", UID:"6a903a57-e868-47d6-969d-51151179c236", APIVersion:"v1", ResourceVersion:"2920", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-2hpdt
replicationcontroller/mock2 replaced
I0114 03:52:35.560021   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973945-6740", Name:"mock2", UID:"2c6c290e-311e-4c57-aeea-2c6ca77d7d2b", APIVersion:"v1", ResourceVersion:"2922", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock2-5nwbf
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
(Bgeneric-resources.sh:104: Successful get rc mock2 {{.metadata.labels.status}}: replaced
(BE0114 03:52:35.905512   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:36.026357   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/mock edited
replicationcontroller/mock2 edited
E0114 03:52:36.141822   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
(BE0114 03:52:36.261725   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:122: Successful get rc mock2 {{.metadata.labels.status}}: edited
(Breplicationcontroller/mock labeled
replicationcontroller/mock2 labeled
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
(Bgeneric-resources.sh:142: Successful get rc mock2 {{.metadata.labels.labeled}}: true
(Breplicationcontroller/mock annotated
replicationcontroller/mock2 annotated
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
(BE0114 03:52:36.906709   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:161: Successful get rc mock2 {{.metadata.annotations.annotated}}: true
(BE0114 03:52:37.027789   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller "mock" deleted
replicationcontroller "mock2" deleted
Testing with file hack/testdata/multi-resource-svclist.json and replace with file hack/testdata/multi-resource-svclist-modify.json
E0114 03:52:37.142807   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
(BE0114 03:52:37.262983   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
(Bservice/mock created
service/mock2 created
generic-resources.sh:70: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:mock2:
(BNAME    TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
mock    ClusterIP   10.0.0.157   <none>        99/TCP    0s
mock2   ClusterIP   10.0.0.89    <none>        99/TCP    0s
E0114 03:52:37.908008   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Name:              mock
Namespace:         namespace-1578973945-6740
Labels:            app=mock
Annotations:       <none>
Selector:          app=mock
Type:              ClusterIP
... skipping 14 lines ...
IP:                10.0.0.89
Port:              <unset>  99/TCP
TargetPort:        9949/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
E0114 03:52:38.029235   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:38.144177   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
service "mock2" deleted
service/mock replaced
service/mock2 replaced
E0114 03:52:38.264445   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
(Bgeneric-resources.sh:98: Successful get services mock2 {{.metadata.labels.status}}: replaced
(Bservice/mock edited
service/mock2 edited
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
(BE0114 03:52:38.909346   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:116: Successful get services mock2 {{.metadata.labels.status}}: edited
(Bservice/mock labeled
service/mock2 labeled
E0114 03:52:39.030140   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:39.145244   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
(Bgeneric-resources.sh:136: Successful get services mock2 {{.metadata.labels.labeled}}: true
(BE0114 03:52:39.265856   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock annotated
service/mock2 annotated
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
(Bgeneric-resources.sh:155: Successful get services mock2 {{.metadata.annotations.annotated}}: true
(Bservice "mock" deleted
service "mock2" deleted
generic-resources.sh:173: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
(Bgeneric-resources.sh:174: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
(BE0114 03:52:39.910470   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:40.031323   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:40.147109   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:40.266965   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
replicationcontroller/mock created
I0114 03:52:40.374959   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973945-6740", Name:"mock", UID:"d3dcb68c-b84d-4e3f-ac17-bda536d72181", APIVersion:"v1", ResourceVersion:"2984", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-2nxss
generic-resources.sh:180: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
(Bgeneric-resources.sh:181: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
(Bservice "mock" deleted
replicationcontroller "mock" deleted
E0114 03:52:40.911522   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:187: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
(BE0114 03:52:41.032601   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:188: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
(B+++ exit code: 0
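Each "Testing with file ... and replace with file ..." round above runs the same lifecycle against a pair of manifests. A sketch of one round using the YAML pair named in the log; which file sets the status=replaced label is inferred from the assertions, and the interactive kubectl edit step is omitted:

kubectl create -f hack/testdata/multi-resource-yaml.yaml            # service/mock and replicationcontroller/mock
kubectl get services mock -o go-template='{{.metadata.labels.status}}'
kubectl replace -f hack/testdata/multi-resource-yaml-modify.yaml    # assertions then expect labels.status: replaced
kubectl label -f hack/testdata/multi-resource-yaml-modify.yaml labeled=true
kubectl annotate -f hack/testdata/multi-resource-yaml-modify.yaml annotated=true
kubectl delete -f hack/testdata/multi-resource-yaml-modify.yaml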
Recording: run_persistent_volumes_tests
Running command: run_persistent_volumes_tests

+++ Running case: test-cmd.run_persistent_volumes_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_persistent_volumes_tests
E0114 03:52:41.148399   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [0114 03:52:41] Creating namespace namespace-1578973961-23963
namespace/namespace-1578973961-23963 created
E0114 03:52:41.268233   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0114 03:52:41] Testing persistent volumes
storage.sh:30: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
(Bpersistentvolume/pv0001 created
E0114 03:52:41.633075   54689 pv_protection_controller.go:116] PV pv0001 failed with : Operation cannot be fulfilled on persistentvolumes "pv0001": the object has been modified; please apply your changes to the latest version and try again
storage.sh:33: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
(Bpersistentvolume "pv0001" deleted
E0114 03:52:41.912654   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume/pv0002 created
E0114 03:52:42.034000   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:42.149554   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:36: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0002:
(BE0114 03:52:42.269225   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume "pv0002" deleted
persistentvolume/pv0003 created
storage.sh:39: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0003:
(Bpersistentvolume "pv0003" deleted
storage.sh:42: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
(BE0114 03:52:42.914482   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume/pv0001 created
E0114 03:52:43.035398   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:45: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
(BE0114 03:52:43.150745   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
persistentvolume "pv0001" deleted
has:warning: deleting cluster-scoped resources
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
persistentvolume "pv0001" deleted
has:persistentvolume "pv0001" deleted
E0114 03:52:43.270543   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:49: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
(B+++ exit code: 0
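The persistent-volume case above only creates, lists, and deletes PVs. A minimal PV sketch matching the pv0001 name used above; the capacity, access mode, and hostPath are placeholders:

kubectl create -f - <<EOF
apiVersion: v1
kind: PersistentVolume
metadata:
  name: pv0001
spec:
  capacity:
    storage: 1Gi
  accessModes:
  - ReadWriteOnce
  hostPath:
    path: /tmp/pv0001
EOF
kubectl get pv -o go-template='{{range.items}}{{.metadata.name}}:{{end}}'
kubectl delete pv pv0001   # PVs are cluster-scoped, hence the deletion warning quoted above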
Recording: run_persistent_volume_claims_tests
Running command: run_persistent_volume_claims_tests

+++ Running case: test-cmd.run_persistent_volume_claims_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_persistent_volume_claims_tests
+++ [0114 03:52:43] Creating namespace namespace-1578973963-1091
namespace/namespace-1578973963-1091 created
Context "test" modified.
+++ [0114 03:52:43] Testing persistent volumes claims
storage.sh:64: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: 
(BE0114 03:52:43.915483   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolumeclaim/myclaim-1 created
I0114 03:52:43.918080   54689 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578973963-1091", Name:"myclaim-1", UID:"e33384c9-d9e7-4a97-9e28-e7f5df278134", APIVersion:"v1", ResourceVersion:"3023", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I0114 03:52:43.921051   54689 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578973963-1091", Name:"myclaim-1", UID:"e33384c9-d9e7-4a97-9e28-e7f5df278134", APIVersion:"v1", ResourceVersion:"3024", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E0114 03:52:44.036613   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:67: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-1:
(Bpersistentvolumeclaim "myclaim-1" deleted
I0114 03:52:44.149062   54689 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578973963-1091", Name:"myclaim-1", UID:"e33384c9-d9e7-4a97-9e28-e7f5df278134", APIVersion:"v1", ResourceVersion:"3027", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E0114 03:52:44.152046   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:44.272053   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolumeclaim/myclaim-2 created
I0114 03:52:44.392654   54689 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578973963-1091", Name:"myclaim-2", UID:"61647609-2375-4559-8c3a-1999ca0bb19d", APIVersion:"v1", ResourceVersion:"3030", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I0114 03:52:44.394902   54689 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578973963-1091", Name:"myclaim-2", UID:"61647609-2375-4559-8c3a-1999ca0bb19d", APIVersion:"v1", ResourceVersion:"3031", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
storage.sh:71: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-2:
(Bpersistentvolumeclaim "myclaim-2" deleted
I0114 03:52:44.654337   54689 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578973963-1091", Name:"myclaim-2", UID:"61647609-2375-4559-8c3a-1999ca0bb19d", APIVersion:"v1", ResourceVersion:"3034", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
persistentvolumeclaim/myclaim-3 created
I0114 03:52:44.892110   54689 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578973963-1091", Name:"myclaim-3", UID:"04e336c1-3320-4962-9bd0-858c5f2c35ca", APIVersion:"v1", ResourceVersion:"3039", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I0114 03:52:44.899853   54689 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578973963-1091", Name:"myclaim-3", UID:"04e336c1-3320-4962-9bd0-858c5f2c35ca", APIVersion:"v1", ResourceVersion:"3041", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E0114 03:52:44.916780   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:75: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-3:
(BE0114 03:52:45.037772   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolumeclaim "myclaim-3" deleted
I0114 03:52:45.135929   54689 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578973963-1091", Name:"myclaim-3", UID:"04e336c1-3320-4962-9bd0-858c5f2c35ca", APIVersion:"v1", ResourceVersion:"3043", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E0114 03:52:45.153168   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:45.273178   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:78: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: 
(B+++ exit code: 0
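The claims above never bind, since no matching PV or storage class exists, which is why the FailedBinding events repeat until each claim is deleted. A minimal PVC sketch using the myclaim-1 name from the output; the size and access mode are placeholders:

kubectl create -f - <<EOF
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: myclaim-1
spec:
  accessModes:
  - ReadWriteOnce
  resources:
    requests:
      storage: 1Gi
EOF
kubectl get pvc -o go-template='{{range.items}}{{.metadata.name}}:{{end}}'
kubectl delete pvc myclaim-1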
Recording: run_storage_class_tests
Running command: run_storage_class_tests

+++ Running case: test-cmd.run_storage_class_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_storage_class_tests
+++ [0114 03:52:45] Testing storage class
storage.sh:92: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: 
(Bstorageclass.storage.k8s.io/storage-class-name created
storage.sh:108: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: storage-class-name:
(BE0114 03:52:45.918121   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:109: Successful get sc {{range.items}}{{.metadata.name}}:{{end}}: storage-class-name:
(BE0114 03:52:46.039125   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storageclass.storage.k8s.io "storage-class-name" deleted
E0114 03:52:46.154506   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:112: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: 
(B+++ exit code: 0
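The storage-class case checks the resource under both its full name and the sc short name. A minimal StorageClass sketch using the name asserted above; the provisioner string is a placeholder, since any value is accepted at creation time:

kubectl create -f - <<EOF
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: storage-class-name
provisioner: example.com/fake-provisioner
EOF
kubectl get sc -o go-template='{{range.items}}{{.metadata.name}}:{{end}}'
kubectl delete storageclass storage-class-name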
E0114 03:52:46.274410   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_nodes_tests
Running command: run_nodes_tests

+++ Running case: test-cmd.run_nodes_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_nodes_tests
... skipping 142 lines ...
  Resource           Requests  Limits
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
(B
E0114 03:52:46.919305   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1383: Successful describe
Name:               127.0.0.1
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Tue, 14 Jan 2020 03:48:07 +0000
... skipping 35 lines ...
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
(B
E0114 03:52:47.040738   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Labels:
matched CreationTimestamp:
matched Conditions:
E0114 03:52:47.156147   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Addresses:
matched Capacity:
matched Pods:
Successful describe nodes:
Name:               127.0.0.1
Roles:              <none>
... skipping 37 lines ...
  Resource           Requests  Limits
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
(BE0114 03:52:47.275521   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:               127.0.0.1
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Tue, 14 Jan 2020 03:48:07 +0000
... skipping 128 lines ...
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
(Bcore.sh:1395: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
(Bnode/127.0.0.1 patched
core.sh:1398: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: true
(BE0114 03:52:47.920714   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 patched
E0114 03:52:48.041982   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1401: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
(BE0114 03:52:48.157291   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
tokenreview.authentication.k8s.io/<unknown> created
E0114 03:52:48.276839   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
tokenreview.authentication.k8s.io/<unknown> created
+++ exit code: 0
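The node assertions above toggle spec.unschedulable on the single test node via patch. A sketch of the equivalent commands; the null patch removes the field, which is why the final assertion sees <no value> rather than false:

kubectl describe node 127.0.0.1
kubectl get node 127.0.0.1 -o go-template='{{.spec.unschedulable}}'
kubectl patch node 127.0.0.1 -p '{"spec":{"unschedulable":true}}'   # node/127.0.0.1 patched
kubectl patch node 127.0.0.1 -p '{"spec":{"unschedulable":null}}'   # clears the field again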
Recording: run_authorization_tests
Running command: run_authorization_tests

+++ Running case: test-cmd.run_authorization_tests 
... skipping 65 lines ...
  }
}
+++ exit code: 0
Successful
message:yes
has:yes
E0114 03:52:48.921847   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:yes
has:yes
E0114 03:52:49.043039   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:49.158554   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:49.277991   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Warning: the server doesn't have a resource type 'invalid_resource'
yes
has:the server doesn't have a resource type
Successful
message:yes
has:yes
Successful
message:error: --subresource can not be used with NonResourceURL
has:subresource can not be used with NonResourceURL
Successful
Successful
message:yes
0
has:0
Successful
message:0
has:0
E0114 03:52:49.923067   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:yes
has not:Warning
E0114 03:52:50.048745   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:50.160673   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:50.279353   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Warning: the server doesn't have a resource type 'foo'
yes
has:Warning: the server doesn't have a resource type 'foo'
Successful
message:Warning: the server doesn't have a resource type 'foo'
... skipping 6 lines ...
message:Warning: resource 'nodes' is not namespace scoped
yes
has:Warning: resource 'nodes' is not namespace scoped
Successful
message:yes
has not:Warning: resource 'nodes' is not namespace scoped
E0114 03:52:50.924595   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
clusterrole.rbac.authorization.k8s.io/testing-CR reconciled
	reconciliation required create
	missing rules added:
		{Verbs:[create delete deletecollection get list patch update watch] APIGroups:[] Resources:[pods] ResourceNames:[] NonResourceURLs:[]}
clusterrolebinding.rbac.authorization.k8s.io/testing-CRB reconciled
	reconciliation required create
... skipping 4 lines ...
	missing subjects added:
		{Kind:Group APIGroup:rbac.authorization.k8s.io Name:system:masters Namespace:}
role.rbac.authorization.k8s.io/testing-R reconciled
	reconciliation required create
	missing rules added:
		{Verbs:[get list watch] APIGroups:[] Resources:[configmaps] ResourceNames:[] NonResourceURLs:[]}
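The reconciled/reconciliation-required output above is what kubectl auth reconcile reports when the RBAC objects in a manifest do not exist yet; roughly (manifest name hypothetical):

  kubectl auth reconcile -f testing-rbac.yaml       # creates missing roles/bindings and adds missing rules/subjects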
E0114 03:52:51.050127   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
legacy-script.sh:821: Successful get rolebindings -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-RB:
E0114 03:52:51.162184   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
legacy-script.sh:822: Successful get roles -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-R:
E0114 03:52:51.280898   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
legacy-script.sh:823: Successful get clusterrolebindings -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CRB:
legacy-script.sh:824: Successful get clusterroles -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CR:
Successful
message:error: only rbac.authorization.k8s.io/v1 is supported: not *v1beta1.ClusterRole
has:only rbac.authorization.k8s.io/v1 is supported
rolebinding.rbac.authorization.k8s.io "testing-RB" deleted
role.rbac.authorization.k8s.io "testing-R" deleted
warning: deleting cluster-scoped resources, not scoped to the provided namespace
clusterrole.rbac.authorization.k8s.io "testing-CR" deleted
clusterrolebinding.rbac.authorization.k8s.io "testing-CRB" deleted
... skipping 2 lines ...

+++ Running case: test-cmd.run_retrieve_multiple_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_retrieve_multiple_tests
Context "test" modified.
+++ [0114 03:52:51] Testing kubectl(v1:multiget)
E0114 03:52:51.925762   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:242: Successful get nodes/127.0.0.1 service/kubernetes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:kubernetes:
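The multiget assertion above queries two different resource types in a single call; kubectl wraps the results in a List, which is why the Go template can range over .items. A minimal sketch:

  kubectl get nodes/127.0.0.1 service/kubernetes -o go-template='{{range.items}}{{.metadata.name}}:{{end}}'
  # prints: 127.0.0.1:kubernetes: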
+++ exit code: 0
Recording: run_resource_aliasing_tests
Running command: run_resource_aliasing_tests
E0114 03:52:52.051302   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_resource_aliasing_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_resource_aliasing_tests
+++ [0114 03:52:52] Creating namespace namespace-1578973972-5291
E0114 03:52:52.163463   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578973972-5291 created
Context "test" modified.
+++ [0114 03:52:52] Testing resource aliasing
E0114 03:52:52.282373   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/cassandra created
I0114 03:52:52.478108   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973972-5291", Name:"cassandra", UID:"f888efb4-4927-45e1-ac2b-edf2e69366fe", APIVersion:"v1", ResourceVersion:"3071", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-6h4qv
I0114 03:52:52.482623   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973972-5291", Name:"cassandra", UID:"f888efb4-4927-45e1-ac2b-edf2e69366fe", APIVersion:"v1", ResourceVersion:"3071", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-cd8hx
service/cassandra created
Waiting for Get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}} : expected: cassandra:cassandra:cassandra:cassandra::, got: cassandra:cassandra:cassandra:cassandra:

discovery.sh:91: FAIL!
Get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}}
  Expected: cassandra:cassandra:cassandra:cassandra::
  Got:      cassandra:cassandra:cassandra:cassandra:
55 /home/prow/go/src/k8s.io/kubernetes/hack/lib/test.sh
E0114 03:52:52.926936   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
discovery.sh:92: Successful get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}}: cassandra:cassandra:cassandra:cassandra:
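The aliasing check above relies on 'all' expanding to a set of core resource types; with a label selector it gathers the cassandra pods, replication controller and service in one listing. A sketch of the query being asserted on:

  kubectl get all -l app=cassandra -o go-template='{{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}}'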
E0114 03:52:53.052687   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod "cassandra-6h4qv" deleted
I0114 03:52:53.109552   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973972-5291", Name:"cassandra", UID:"f888efb4-4927-45e1-ac2b-edf2e69366fe", APIVersion:"v1", ResourceVersion:"3077", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-k4z5j
I0114 03:52:53.111892   54689 event.go:278] Event(v1.ObjectReference{Kind:"Endpoints", Namespace:"namespace-1578973972-5291", Name:"cassandra", UID:"6956a0fd-ac78-47ae-a0fa-74fe164af69c", APIVersion:"v1", ResourceVersion:"3079", FieldPath:""}): type: 'Warning' reason: 'FailedToUpdateEndpoint' Failed to update endpoint namespace-1578973972-5291/cassandra: Operation cannot be fulfilled on endpoints "cassandra": the object has been modified; please apply your changes to the latest version and try again
pod "cassandra-cd8hx" deleted
I0114 03:52:53.119380   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973972-5291", Name:"cassandra", UID:"f888efb4-4927-45e1-ac2b-edf2e69366fe", APIVersion:"v1", ResourceVersion:"3088", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-shcbq
replicationcontroller "cassandra" deleted
service "cassandra" deleted
E0114 03:52:53.164623   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_kubectl_explain_tests
Running command: run_kubectl_explain_tests

+++ Running case: test-cmd.run_kubectl_explain_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
E0114 03:52:53.283521   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ command: run_kubectl_explain_tests
+++ [0114 03:52:53] Testing kubectl(v1:explain)
KIND:     Pod
VERSION:  v1

DESCRIPTION:
... skipping 64 lines ...

FIELD:    message <string>

DESCRIPTION:
     A human readable message indicating details about why the pod is in this
     condition.
E0114 03:52:53.928772   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
KIND:     CronJob
VERSION:  batch/v1beta1

DESCRIPTION:
     CronJob represents the configuration of a single cron job.

... skipping 20 lines ...
     https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#spec-and-status

   status	<Object>
     Current status of a cron job. More info:
     https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#spec-and-status

E0114 03:52:54.053832   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
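The KIND/VERSION/FIELD blocks above are kubectl explain output; the queries look roughly like:

  kubectl explain pods.status.conditions.message    # FIELD: message <string>
  kubectl explain cronjob                           # KIND: CronJob, VERSION as served by this cluster (batch/v1beta1 here)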
+++ exit code: 0
Recording: run_swagger_tests
Running command: run_swagger_tests

+++ Running case: test-cmd.run_swagger_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_swagger_tests
E0114 03:52:54.165749   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [0114 03:52:54] Testing swagger
E0114 03:52:54.284983   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_kubectl_sort_by_tests
Running command: run_kubectl_sort_by_tests

+++ Running case: test-cmd.run_kubectl_sort_by_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_sort_by_tests
+++ [0114 03:52:54] Testing kubectl --sort-by
get.sh:256: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
No resources found in namespace-1578973972-5291 namespace.
No resources found in namespace-1578973972-5291 namespace.
get.sh:264: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 03:52:54.930188   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/valid-pod created
E0114 03:52:55.056995   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:55.166961   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:268: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
E0114 03:52:55.286107   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          0s
has:valid-pod
Successful
message:I0114 03:52:55.378891   86497 loader.go:375] Config loaded from file:  /tmp/tmp.VHuTjdTtMr/.kube/config
... skipping 27 lines ...
has:includeObject=Object
get.sh:279: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
get.sh:283: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
get.sh:288: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 03:52:55.931525   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:56.058387   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/sorted-pod1 created
E0114 03:52:56.168300   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:292: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:
E0114 03:52:56.287411   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/sorted-pod2 created
get.sh:296: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:sorted-pod2:
pod/sorted-pod3 created
get.sh:300: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:sorted-pod2:sorted-pod3:
E0114 03:52:56.932713   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:sorted-pod1:sorted-pod2:sorted-pod3:
has:sorted-pod1:sorted-pod2:sorted-pod3:
Successful
message:sorted-pod3:sorted-pod2:sorted-pod1:
has:sorted-pod3:sorted-pod2:sorted-pod1:
E0114 03:52:57.059553   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:sorted-pod2:sorted-pod1:sorted-pod3:
has:sorted-pod2:sorted-pod1:sorted-pod3:
E0114 03:52:57.169455   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:sorted-pod1:sorted-pod2:sorted-pod3:
has:sorted-pod1:sorted-pod2:sorted-pod3:
E0114 03:52:57.288566   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:I0114:I0114:I0114:I0114:I0114:I0114:I0114:I0114:I0114:I0114:NAME:sorted-pod2:sorted-pod1:sorted-pod3:
has:sorted-pod2:sorted-pod1:sorted-pod3:
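The differently ordered sorted-pod listings above come from kubectl get --sort-by with different sort keys; a sketch (keys are illustrative):

  kubectl get pods --sort-by=.metadata.name -o go-template='{{range.items}}{{.metadata.name}}:{{end}}'
  # prints: sorted-pod1:sorted-pod2:sorted-pod3:
  kubectl get pods --sort-by='{.metadata.creationTimestamp}' --no-headers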
Successful
message:I0114 03:52:57.319360   86762 loader.go:375] Config loaded from file:  /tmp/tmp.VHuTjdTtMr/.kube/config
I0114 03:52:57.330135   86762 round_trippers.go:420] GET http://localhost:8080/api/v1/namespaces/namespace-1578973972-5291/pods
... skipping 23 lines ...

+++ Running case: test-cmd.run_kubectl_all_namespace_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_all_namespace_tests
+++ [0114 03:52:57] Testing kubectl --all-namespace
get.sh:342: Successful get namespaces {{range.items}}{{if eq .metadata.name \"default\"}}{{.metadata.name}}:{{end}}{{end}}: default:
E0114 03:52:57.933953   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:346: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 03:52:58.060958   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:58.170568   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/valid-pod created
E0114 03:52:58.289749   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:350: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
NAMESPACE                   NAME        READY   STATUS    RESTARTS   AGE
namespace-1578973972-5291   valid-pod   0/1     Pending   0          0s
namespace/all-ns-test-1 created
serviceaccount/test created
namespace/all-ns-test-2 created
... skipping 57 lines ...
namespace-1578973945-6740    default   0         32s
namespace-1578973961-23963   default   0         17s
namespace-1578973963-1091    default   0         15s
namespace-1578973972-5291    default   0         6s
some-other-random            default   0         8s
has:all-ns-test-2
E0114 03:52:58.935308   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:NAMESPACE                    NAME      SECRETS   AGE
all-ns-test-1                default   0         0s
all-ns-test-1                test      0         0s
all-ns-test-2                default   0         0s
all-ns-test-2                test      0         0s
... skipping 50 lines ...
namespace-1578973945-6740    default   0         32s
namespace-1578973961-23963   default   0         17s
namespace-1578973963-1091    default   0         15s
namespace-1578973972-5291    default   0         6s
some-other-random            default   0         8s
has:all-ns-test-2
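The NAMESPACE/NAME/SECRETS/AGE tables above are service accounts listed across namespaces, which is why the freshly created all-ns-test-1 and all-ns-test-2 entries show up; roughly:

  kubectl get serviceaccounts --all-namespaces
  kubectl get serviceaccounts -A                    # short form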
E0114 03:52:59.062075   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace "all-ns-test-1" deleted
E0114 03:52:59.171908   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:59.290904   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:52:59.936561   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:00.063277   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:00.172996   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:00.292328   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:00.937825   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:01.064639   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:01.174095   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:01.293578   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:01.939076   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:02.065903   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:02.175207   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:02.294730   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:02.940450   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:03.067178   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:03.176281   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:03.296095   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:03.941551   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:04.068615   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:04.177394   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:04.297313   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace "all-ns-test-2" deleted
E0114 03:53:04.942867   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:05.069824   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:05.178792   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:05.298519   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:05.944435   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:06.071289   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:06.179928   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:06.300057   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:06.948848   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:07.072749   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:07.181024   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:07.301237   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:07.950144   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:08.074055   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:08.182402   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:08.302477   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:08.951413   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:09.075133   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:09.183945   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 03:53:09.196483   54689 namespace_controller.go:185] Namespace has been deleted all-ns-test-1
E0114 03:53:09.304026   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:376: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
get.sh:380: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
get.sh:384: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
Successful
message:NAME        STATUS     ROLES    AGE    VERSION
127.0.0.1   NotReady   <none>   5m2s   
has not:NAMESPACE
E0114 03:53:09.952715   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_template_output_tests
Running command: run_template_output_tests

+++ Running case: test-cmd.run_template_output_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_template_output_tests
+++ [0114 03:53:10] Testing --template support on commands
+++ [0114 03:53:10] Creating namespace namespace-1578973990-3577
E0114 03:53:10.076613   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578973990-3577 created
E0114 03:53:10.185057   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
E0114 03:53:10.305167   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
template-output.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/valid-pod created
{
    "apiVersion": "v1",
    "items": [
        {
... skipping 51 lines ...
    }
}
template-output.sh:35: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
Successful
message:valid-pod:
has:valid-pod:
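The valid-pod: messages above are Go template output over a single object; a minimal sketch:

  kubectl get pod valid-pod -o go-template='{{.metadata.name}}:'   # prints: valid-pod: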
E0114 03:53:10.953828   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
Successful
message:valid-pod:
has:valid-pod:
E0114 03:53:11.077750   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:11.186297   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
E0114 03:53:11.306431   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
Successful
message:scale-1:
has:scale-1:
... skipping 6 lines ...
message:nginx:
has:nginx:
kubectl run --generator=job/v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
Successful
message:pi:
has:pi:
E0114 03:53:11.954879   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:127.0.0.1:
has:127.0.0.1:
node/127.0.0.1 untainted
E0114 03:53:12.078842   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:12.187460   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/cassandra created
I0114 03:53:12.268090   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973990-3577", Name:"cassandra", UID:"5bd4f953-96c5-43e1-b7e3-a90935149477", APIVersion:"v1", ResourceVersion:"3157", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-r6v6s
I0114 03:53:12.272124   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973990-3577", Name:"cassandra", UID:"5bd4f953-96c5-43e1-b7e3-a90935149477", APIVersion:"v1", ResourceVersion:"3157", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-dm5f6
E0114 03:53:12.307275   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:cassandra:
has:cassandra:
	reconciliation required create
	missing rules added:
		{Verbs:[create delete deletecollection get list patch update watch] APIGroups:[] Resources:[pods] ResourceNames:[] NonResourceURLs:[]}
... skipping 20 lines ...
has:cm:
I0114 03:53:12.892330   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578973990-3577", Name:"deploy", UID:"1408c418-3300-42d0-9eec-1b056add4f55", APIVersion:"apps/v1", ResourceVersion:"3166", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set deploy-74bcc58696 to 1
I0114 03:53:12.896877   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973990-3577", Name:"deploy-74bcc58696", UID:"0342b6e6-e71c-49b2-9add-d2cb9829424a", APIVersion:"apps/v1", ResourceVersion:"3167", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deploy-74bcc58696-wvgfn
Successful
message:deploy:
has:deploy:
E0114 03:53:12.955912   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:13.080122   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
cronjob.batch/pi created
E0114 03:53:13.188686   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
E0114 03:53:13.308577   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:bar:
has:bar:
Successful
message:foo:
has:foo:
... skipping 9 lines ...
Successful
message:foo:
has:foo:
Successful
message:foo:
has:foo:
E0114 03:53:13.957015   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
E0114 03:53:14.081208   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
Successful
message:valid-pod:
has:valid-pod:
E0114 03:53:14.189749   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:kubernetes:
has:kubernetes:
E0114 03:53:14.312686   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
I0114 03:53:14.419025   54689 namespace_controller.go:185] Namespace has been deleted all-ns-test-2
Successful
message:foo:
... skipping 7 lines ...
Successful
message:foo:
has:foo:
Successful
message:foo:
has:foo:
E0114 03:53:14.957971   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
Successful
message:foo:
has:foo:
E0114 03:53:15.082395   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
E0114 03:53:15.190969   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:apiVersion: v1
clusters:
- cluster:
    certificate-authority-data: DATA+OMITTED
    server: https://does-not-work
... skipping 6 lines ...
  name: test
current-context: test
kind: Config
preferences: {}
users: null
has:kind: Config
E0114 03:53:15.313830   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:deploy:
has:deploy:
Successful
message:deploy:
has:deploy:
... skipping 3 lines ...
Successful
message:deploy:
has:deploy:
Successful
message:Config:
has:Config
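The kubeconfig dump and the Config: message above come from kubectl config view, which redacts certificate data and can also be run through an output template; a sketch:

  kubectl config view                                  # certificate-authority-data shown as DATA+OMITTED
  kubectl config view -o go-template='{{.kind}}:'      # prints: Config: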
E0114 03:53:15.959150   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:apiVersion: v1
kind: ConfigMap
metadata:
  creationTimestamp: null
  name: cm
has:kind: ConfigMap
E0114 03:53:16.083983   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
cronjob.batch "pi" deleted
E0114 03:53:16.192448   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod "cassandra-dm5f6" deleted
I0114 03:53:16.210953   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973990-3577", Name:"cassandra", UID:"5bd4f953-96c5-43e1-b7e3-a90935149477", APIVersion:"v1", ResourceVersion:"3163", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-r486v
pod "cassandra-r6v6s" deleted
I0114 03:53:16.221227   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578973990-3577", Name:"cassandra", UID:"5bd4f953-96c5-43e1-b7e3-a90935149477", APIVersion:"v1", ResourceVersion:"3163", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-t5rgk
pod "deploy-74bcc58696-wvgfn" deleted
I0114 03:53:16.227894   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578973990-3577", Name:"deploy-74bcc58696", UID:"0342b6e6-e71c-49b2-9add-d2cb9829424a", APIVersion:"apps/v1", ResourceVersion:"3174", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deploy-74bcc58696-rmknq
pod "valid-pod" deleted
E0114 03:53:16.314892   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller "cassandra" deleted
clusterrole.rbac.authorization.k8s.io "myclusterrole" deleted
clusterrolebinding.rbac.authorization.k8s.io "foo" deleted
deployment.apps "deploy" deleted
+++ exit code: 0
Recording: run_certificates_tests
Running command: run_certificates_tests

+++ Running case: test-cmd.run_certificates_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_certificates_tests
+++ [0114 03:53:16] Testing certificates
certificatesigningrequest.certificates.k8s.io/foo created
E0114 03:53:16.960726   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:17.085032   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:29: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo approved
E0114 03:53:17.193581   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
{
    "apiVersion": "v1",
    "items": [
        {
            "apiVersion": "certificates.k8s.io/v1beta1",
            "kind": "CertificateSigningRequest",
... skipping 49 lines ...
    "kind": "List",
    "metadata": {
        "resourceVersion": "",
        "selfLink": ""
    }
}
E0114 03:53:17.316150   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:32: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Approved
certificatesigningrequest.certificates.k8s.io "foo" deleted
certificate.sh:34: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo created
certificate.sh:37: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
E0114 03:53:17.961805   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo approved
{
    "apiVersion": "v1",
    "items": [
        {
            "apiVersion": "certificates.k8s.io/v1beta1",
... skipping 32 lines ...
    "kind": "List",
    "metadata": {
        "resourceVersion": "",
        "selfLink": ""
    }
}
E0114 03:53:18.086050   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:40: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Approved
E0114 03:53:18.194700   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io "foo" deleted
E0114 03:53:18.317185   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:42: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo created
certificate.sh:46: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo denied
{
    "apiVersion": "v1",
... skipping 35 lines ...
    "kind": "List",
    "metadata": {
        "resourceVersion": "",
        "selfLink": ""
    }
}
E0114 03:53:18.962686   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:49: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Denied
certificatesigningrequest.certificates.k8s.io "foo" deleted
E0114 03:53:19.087121   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 03:53:19.195918   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:51: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
E0114 03:53:19.318445   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
certificate.sh:54: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo denied
{
    "apiVersion": "v1",
    "items": [
... skipping 54 lines ...
        "resourceVersion": "",
        "selfLink": ""
    }
}
certificate.sh:57: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Denied
certificatesigningrequest.certificates.k8s.io "foo" deleted
E0114 03:53:19.963902   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:59: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
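The CSR lifecycle above (create, approve or deny, check conditions, delete) maps onto these commands; a sketch using the foo name from the log:

  kubectl certificate approve foo
  kubectl certificate deny foo
  kubectl get csr/foo -o go-template='{{range.status.conditions}}{{.type}}{{end}}'   # Approved or Denied
  kubectl delete csr foo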
+++ exit code: 0
Recording: run_cluster_management_tests
Running command: run_cluster_management_tests

E0114 03:53:20.088346   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ Running case: test-cmd.run_cluster_management_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_cluster_management_tests
+++ [0114 03:53:20] Testing cluster-management commands
E0114 03:53:20.197025   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:27: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
E0114 03:53:20.319786   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/test-pod-1 created
pod/test-pod-2 created
node-management.sh:76: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: 
node/127.0.0.1 tainted
node-management.sh:79: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: dedicated=foo:PreferNoSchedule
E0114 03:53:20.965089   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 untainted
E0114 03:53:21.089536   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:83: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: 
E0114 03:53:21.198113   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:87: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0114 03:53:21.321026   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 cordoned (dry run)
node-management.sh:89: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node-management.sh:93: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 cordoned (dry run)
node/127.0.0.1 drained (dry run)
node-management.sh:96: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
node-management.sh:97: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node-management.sh:101: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0114 03:53:21.966166   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:103: Successful get pods {{range .items}}{{.metadata.name}},{{end}}: test-pod-1,test-pod-2,
E0114 03:53:22.090647   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 cordoned
node/127.0.0.1 drained
E0114 03:53:22.199243   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:106: Successful get pods/test-pod-2 {{.metadata.name}}: test-pod-2
E0114 03:53:22.322485   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod "test-pod-2" deleted
node/127.0.0.1 uncordoned
node-management.sh:111: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node-management.sh:115: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
Successful
message:node/127.0.0.1 already uncordoned (dry run)
has:already uncordoned
node-management.sh:119: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
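The taint/cordon/drain/uncordon sequence above corresponds roughly to the following; flags are a sketch, not the harness's exact invocation (this release still used the boolean --dry-run form, newer kubectl takes --dry-run=client):

  kubectl taint nodes 127.0.0.1 dedicated=foo:PreferNoSchedule
  kubectl taint nodes 127.0.0.1 dedicated-          # remove the taint again
  kubectl cordon 127.0.0.1 --dry-run
  kubectl drain 127.0.0.1 --force
  kubectl uncordon 127.0.0.1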
E0114 03:53:22.967429   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 labeled
E0114 03:53:23.091870   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:124: Successful get nodes 127.0.0.1 {{.metadata.labels.test}}: label
E0114 03:53:23.200552   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: cannot specify both a node name and a --selector option
See 'kubectl drain -h' for help and examples
has:cannot specify both a node name
Successful
message:error: USAGE: cordon NODE [flags]
See 'kubectl cordon -h' for help and examples
has:error\: USAGE\: cordon NODE
E0114 03:53:23.323960   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 already uncordoned
Successful
message:error: You must provide one or more resources by argument or filename.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
   '<resource> <name>'
   '<resource>'
has:must provide one or more resources
... skipping 9 lines ...
Running command: run_plugins_tests

+++ Running case: test-cmd.run_plugins_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_plugins_tests
+++ [0114 03:53:23] Testing kubectl plugins
E0114 03:53:23.968881   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/version/kubectl-version
  - warning: kubectl-version overwrites existing command: "kubectl version"

error: one plugin warning was found
has:kubectl-version overwrites existing command: "kubectl version"
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/kubectl-foo
test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo
  - warning: test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin: test/fixtures/pkg/kubectl/plugins/kubectl-foo

error: one plugin warning was found
has:test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin
E0114 03:53:24.093127   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/kubectl-foo
has:plugins are available
E0114 03:53:24.201752   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Unable read directory "test/fixtures/pkg/kubectl/plugins/empty" from your PATH: open test/fixtures/pkg/kubectl/plugins/empty: no such file or directory. Skipping...
error: unable to find any kubectl plugins in your PATH
has:unable to find any kubectl plugins in your PATH
Successful
message:I am plugin foo
has:plugin foo
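kubectl discovers plugins as executables named kubectl-<name> on PATH, which is what the warnings about overwriting and overshadowed plugins above refer to; a minimal sketch of a plugin like the one printing "I am plugin foo":

  cat > kubectl-foo <<'EOF'
  #!/bin/bash
  echo "I am plugin foo"
  EOF
  chmod +x kubectl-foo
  PATH="$PWD:$PATH" kubectl foo          # prints: I am plugin foo
  PATH="$PWD:$PATH" kubectl plugin list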
E0114 03:53:24.325248   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:I am plugin bar called with args test/fixtures/pkg/kubectl/plugins/bar/kubectl-bar arg1
has:test/fixtures/pkg/kubectl/plugins/bar/kubectl-bar arg1
Successful
message:Client Version: version.Info{Major:"1", Minor:"18+", GitVersion:"v1.18.0-alpha.1.658+61d36e4a43b831", GitCommit:"61d36e4a43b831f960b190b81d371cb33b5f20d1", GitTreeState:"clean", BuildDate:"2020-01-14T01:41:08Z", GoVersion:"go1.13.5", Compiler:"gc", Platform:"linux/amd64"}
has:Client Version
... skipping 6 lines ...

+++ Running case: test-cmd.run_impersonation_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_impersonation_tests
+++ [0114 03:53:24] Testing impersonation
Successful
message:error: requesting groups or user-extra for  without impersonating a user
has:without impersonating a user
E0114 03:53:24.970182   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
E0114 03:53:25.094270   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
authorization.sh:68: Successful get csr/foo {{.spec.username}}: user1
E0114 03:53:25.203158   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
authorization.sh:69: Successful get csr/foo {{range .spec.groups}}{{.}}{{end}}: system:authenticated
E0114 03:53:25.326540   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io "foo" deleted
certificatesigningrequest.certificates.k8s.io/foo created
authorization.sh:74: Successful get csr/foo {{len .spec.groups}}: 3
authorization.sh:75: Successful get csr/foo {{range .spec.groups}}{{.}} {{end}}: group2 group1 ,,,chameleon 
(Bcertificatesigningrequest.certificates.k8s.io "foo" deleted
+++ exit code: 0
E0114 03:53:25.974169   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_wait_tests
Running command: run_wait_tests

+++ Running case: test-cmd.run_wait_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_wait_tests
+++ [0114 03:53:26] Testing kubectl wait
+++ [0114 03:53:26] Creating namespace namespace-1578974006-26768
E0114 03:53:26.095518   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578974006-26768 created
E0114 03:53:26.204865   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
E0114 03:53:26.327892   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/test-1 created
I0114 03:53:26.361590   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578974006-26768", Name:"test-1", UID:"e6d93410-f2f8-4fdf-a861-d0ad6a332163", APIVersion:"apps/v1", ResourceVersion:"3261", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-1-6d98955cc9 to 1
I0114 03:53:26.367015   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578974006-26768", Name:"test-1-6d98955cc9", UID:"8cc47d3d-1ac5-433f-99cc-2e0725216c5a", APIVersion:"apps/v1", ResourceVersion:"3262", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-1-6d98955cc9-vnlqx
deployment.apps/test-2 created
I0114 03:53:26.458964   54689 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578974006-26768", Name:"test-2", UID:"e35e8aed-1e20-41a1-973b-3e40adfc5e16", APIVersion:"apps/v1", ResourceVersion:"3271", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-2-65897ff84d to 1
I0114 03:53:26.466143   54689 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578974006-26768", Name:"test-2-65897ff84d", UID:"5e963a3c-a467-45a9-aa6b-80d414e95a6f", APIVersion:"apps/v1", ResourceVersion:"3272", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-2-65897ff84d-w6b4h
wait.sh:36: Successful get deployments {{range .items}}{{.metadata.name}},{{end}}: test-1,test-2,
E0114 03:53:26.975372   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 7 lines ...
deployment.apps "test-1" deleted
deployment.apps "test-2" deleted
Successful
message:deployment.apps/test-1 condition met
deployment.apps/test-2 condition met
has:test-1 condition met
... skipping 4 lines ...
+++ exit code: 0
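The "condition met" messages are kubectl wait reporting that each Deployment reached the awaited state. A sketch of typical usage against the two test Deployments; the precise condition and flags used by wait.sh are not visible in this excerpt:

  kubectl wait deployment/test-1 --for=condition=Available --timeout=60s -n namespace-1578974006-26768
  kubectl wait deployment/test-2 --for=condition=Available --timeout=60s -n namespace-1578974006-26768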
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
No resources found
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
No resources found
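The "Immediate deletion" warnings are what kubectl prints when a delete skips graceful termination, and "No resources found" confirms the namespace is already empty. A hypothetical command that triggers this exact warning (not necessarily the cleanup command the harness ran):

  kubectl delete deployment test-1 --force --grace-period=0 -n namespace-1578974006-26768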
+++ [0114 03:53:28] TESTS PASSED
E0114 03:53:28.978095   54689 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 03:53:28.979059   51249 controller.go:180] Shutting down kubernetes service endpoint reconciler
I0114 03:53:28.979234   51249 dynamic_serving_content.go:144] Shutting down serving-cert::/tmp/apiserver.crt::/tmp/apiserver.key
I0114 03:53:28.979290   51249 secure_serving.go:222] Stopped listening on 127.0.0.1:8080
I0114 03:53:28.979300   51249 autoregister_controller.go:164] Shutting down autoregister controller
I0114 03:53:28.979311   51249 controller.go:123] Shutting down OpenAPI controller
I0114 03:53:28.979384   51249 crdregistration_controller.go:142] Shutting down crd-autoregister controller
... skipping 9 lines ...
I0114 03:53:28.979571   51249 naming_controller.go:300] Shutting down NamingConditionController
I0114 03:53:28.979572   51249 available_controller.go:398] Shutting down AvailableConditionController
I0114 03:53:28.980010   51249 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0114 03:53:28.980337   51249 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 7 lines ...
I0114 03:53:28.980976   51249 secure_serving.go:222] Stopped listening on 127.0.0.1:6443
... skipping 99 lines ...
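The wall of clientconn.go warnings is the apiserver's etcd (gRPC) client retrying 127.0.0.1:2379 after the local etcd started for test-cmd has already been stopped, so every dial ends in "connection refused" until the process exits. A quick, hypothetical shell check (not part of the test scripts) that would confirm the endpoint is gone:

  # Expect this to fail with a connection error once etcd has been torn down.
  curl -s --max-time 2 http://127.0.0.1:2379/health || echo "etcd at 127.0.0.1:2379 is not reachable"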
junit report dir: /logs/artifacts
+++ [0114 03:53:29] Clean up complete
+ make test-integration
W0114 03:53:29.980794   51249 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 127 lines ...
W0114 03:53:31.860743   51249 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 03:53:31.872315   51249 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 03:53:31.889755   51249 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 03:53:31.900019   51249 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 03:53:33.398406   51249 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
[same gRPC reconnect warning repeated 9 more times between 03:53:33.501 and 03:53:33.678]
+++ [0114 03:53:33] Checking etcd is on PATH
/home/prow/go/src/k8s.io/kubernetes/third_party/etcd/etcd
W0114 03:53:33.709946   51249 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
+++ [0114 03:53:33] Starting etcd instance
W0114 03:53:33.721465   51249 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 03:53:33.751615   51249 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 03:53:33.756556   51249 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
etcd --advertise-client-urls http://127.0.0.1:2379 --data-dir /tmp/tmp.0irsqoIpZ0 --listen-client-urls http://127.0.0.1:2379 --debug > "/logs/artifacts/etcd.0e2b3233-367f-11ea-9f20-3687633bf296.root.log.DEBUG.20200114-035333.90806" 2>/dev/null
Waiting for etcd to come up.
W0114 03:53:33.776383   51249 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
E0114 03:53:34.524194   51249 control
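[editor's note] The "Waiting for etcd to come up." step above is where the test harness blocks until etcd starts serving on its client URL; the gRPC reconnect warnings in this log appear to be the apiserver's etcd client retrying until that happens. A minimal sketch of such a readiness wait in Go is shown below; the client URL and timeout are assumptions taken from the etcd flags in the log, not the harness's actual shell script.

package main

import (
	"fmt"
	"net/http"
	"time"
)

// waitForEtcd polls etcd's /health endpoint until it answers with 200 OK
// or the timeout expires. This mirrors the "Waiting for etcd to come up."
// step only conceptually; it is not the harness's real implementation.
func waitForEtcd(clientURL string, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for time.Now().Before(deadline) {
		resp, err := http.Get(clientURL + "/health")
		if err == nil {
			resp.Body.Close()
			if resp.StatusCode == http.StatusOK {
				return nil // etcd is serving client traffic
			}
		}
		time.Sleep(250 * time.Millisecond) // brief pause before retrying
	}
	return fmt.Errorf("etcd at %s not healthy after %s", clientURL, timeout)
}

func main() {
	// URL taken from --listen-client-urls in the log; the 30s timeout is an assumption.
	if err := waitForEtcd("http://127.0.0.1:2379", 30*time.Second); err != nil {
		panic(err)
	}
	fmt.Println("etcd is up")
}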