Result: FAILURE
Tests: 1 failed / 2610 succeeded
Started: 2020-01-14 01:36
Elapsed: 26m1s
Revision: master
resultstore: https://source.cloud.google.com/results/invocations/97d49536-4c40-459e-89c5-3c26b2b26887/targets/test

Test Failures


k8s.io/kubernetes/test/integration/client TestDynamicClient 6.87s

go test -v k8s.io/kubernetes/test/integration/client -run TestDynamicClient$
=== RUN   TestDynamicClient
I0114 01:53:44.269198  106069 establishing_controller.go:85] Shutting down EstablishingController
I0114 01:53:45.242100  106069 serving.go:307] Generated self-signed cert (/tmp/kubernetes-kube-apiserver447156705/apiserver.crt, /tmp/kubernetes-kube-apiserver447156705/apiserver.key)
I0114 01:53:45.242135  106069 server.go:596] external host was not specified, using 127.0.0.1
W0114 01:53:45.242146  106069 authentication.go:439] AnonymousAuth is not allowed with the AlwaysAllow authorizer. Resetting AnonymousAuth to false. You should use a different authorizer
W0114 01:53:45.579199  106069 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 01:53:45.579230  106069 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 01:53:45.579243  106069 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 01:53:45.579415  106069 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 01:53:45.580432  106069 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 01:53:45.580475  106069 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 01:53:45.580502  106069 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 01:53:45.580528  106069 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 01:53:45.580752  106069 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 01:53:45.580933  106069 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 01:53:45.580968  106069 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 01:53:45.581025  106069 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0114 01:53:45.581042  106069 plugins.go:158] Loaded 9 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,MutatingAdmissionWebhook,RuntimeClass.
I0114 01:53:45.581050  106069 plugins.go:161] Loaded 6 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ValidatingAdmissionWebhook,RuntimeClass,ResourceQuota.
I0114 01:53:45.582502  106069 plugins.go:158] Loaded 9 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,MutatingAdmissionWebhook,RuntimeClass.
I0114 01:53:45.582600  106069 plugins.go:161] Loaded 6 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ValidatingAdmissionWebhook,RuntimeClass,ResourceQuota.
I0114 01:53:45.584417  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.584457  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.585487  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.585517  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0114 01:53:45.617721  106069 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0114 01:53:45.618997  106069 master.go:264] Using reconciler: lease
I0114 01:53:45.619248  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.619285  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.622477  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.622545  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.623572  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.623606  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.626723  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.626758  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.627851  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.627877  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.628838  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.628863  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.630546  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.630572  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.631568  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.631591  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.633399  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.633431  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.635591  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.635619  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.636584  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.636613  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.637629  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.637653  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.639045  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.639074  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.641407  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.641439  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.642669  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.642702  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.643630  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.643665  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.644320  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.644490  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.645364  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.645398  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.646148  106069 rest.go:113] the default service ipfamily for this cluster is: IPv4
I0114 01:53:45.831482  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.831521  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.832701  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.832755  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.834004  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.834035  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.835125  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.835158  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.836472  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.836510  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.837492  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.837524  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.838683  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.838712  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.839756  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.839832  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.841533  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.841567  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.842674  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.842705  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.843561  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.843588  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.844768  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.844796  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.845720  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.845758  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.847064  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.847138  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.848190  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.848230  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.852239  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.852300  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.854149  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.854185  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.855554  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.855588  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.856911  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.856955  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.858082  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.858125  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.859135  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.859223  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.860737  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.860767  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.863147  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.863253  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.864487  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.864513  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.865561  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.865587  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.867044  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.867068  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.867833  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.867935  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.869228  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.869257  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.870707  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.870731  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.872280  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.872312  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.876446  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.876481  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.877592  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.877626  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.878796  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.878827  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.879974  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.880155  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.881392  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.881416  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.883454  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.883513  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.884616  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.884646  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.885591  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.885631  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.887117  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.887294  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.888412  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.888554  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.890549  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.890587  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.891465  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.891507  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.893077  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.893121  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.894219  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.894244  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.895629  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.895836  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.897016  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.897050  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.898077  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.898185  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.899362  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.899390  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.900321  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.900393  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.901466  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.901494  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.903668  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.903703  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.904745  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.904780  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.908202  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.908249  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.909636  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.909666  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
E0114 01:53:46.090492  106069 controller.go:183] an error on the server ("") has prevented the request from succeeding (get endpoints kubernetes)
W0114 01:53:46.157187  106069 genericapiserver.go:404] Skipping API discovery.k8s.io/v1alpha1 because it has no resources.
I0114 01:53:46.582451  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:46.582539  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0114 01:53:46.695917  106069 genericapiserver.go:404] Skipping API apps/v1beta2 because it has no resources.
W0114 01:53:46.695945  106069 genericapiserver.go:404] Skipping API apps/v1beta1 because it has no resources.
I0114 01:53:46.714243  106069 plugins.go:158] Loaded 9 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,MutatingAdmissionWebhook,RuntimeClass.
I0114 01:53:46.714399  106069 plugins.go:161] Loaded 6 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ValidatingAdmissionWebhook,RuntimeClass,ResourceQuota.
W0114 01:53:46.715820  106069 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0114 01:53:46.716097  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:46.716134  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:46.717546  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:46.717577  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0114 01:53:46.720722  106069 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0114 01:53:46.721305  106069 aggregator.go:182] Skipping APIService creation for flowcontrol.apiserver.k8s.io/v1alpha1
W0114 01:53:49.269691  106069 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.StorageClass ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0114 01:53:49.269717  106069 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.Secret ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0114 01:53:49.269765  106069 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.MutatingWebhookConfiguration ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0114 01:53:49.269775  106069 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.Endpoints ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0114 01:53:49.269815  106069 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.Pod ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0114 01:53:49.269859  106069 reflector.go:340] k8s.io/kubernetes/pkg/master/controller/clusterauthenticationtrust/cluster_authentication_trust_controller.go:444: watch of *v1.ConfigMap ended with: very short watch: k8s.io/kubernetes/pkg/master/controller/clusterauthenticationtrust/cluster_authentication_trust_controller.go:444: Unexpected watch close - watch lasted less than a second and no items received
W0114 01:53:49.269862  106069 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1beta1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0114 01:53:49.269930  106069 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.ResourceQuota ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
I0114 01:53:49.952502  106069 dynamic_cafile_content.go:166] Starting request-header::/tmp/kubernetes-kube-apiserver447156705/proxy-ca.crt
I0114 01:53:49.953073  106069 dynamic_cafile_content.go:166] Starting client-ca-bundle::/tmp/kubernetes-kube-apiserver447156705/client-ca.crt
I0114 01:53:49.953135  106069 dynamic_serving_content.go:129] Starting serving-cert::/tmp/kubernetes-kube-apiserver447156705/apiserver.crt::/tmp/kubernetes-kube-apiserver447156705/apiserver.key
I0114 01:53:49.953559  106069 secure_serving.go:178] Serving securely on 127.0.0.1:42985
I0114 01:53:49.953581  106069 tlsconfig.go:241] Starting DynamicServingCertificateController
I0114 01:53:49.953622  106069 available_controller.go:386] Starting AvailableConditionController
I0114 01:53:49.953635  106069 cache.go:32] Waiting for caches to sync for AvailableConditionController controller
I0114 01:53:49.953734  106069 crd_finalizer.go:264] Starting CRDFinalizer
I0114 01:53:49.953761  106069 apiservice_controller.go:94] Starting APIServiceRegistrationController
I0114 01:53:49.953767  106069 controller.go:86] Starting OpenAPI controller
I0114 01:53:49.953773  106069 cache.go:32] Waiting for caches to sync for APIServiceRegistrationController controller
I0114 01:53:49.953796  106069 customresource_discovery_controller.go:209] Starting DiscoveryController
I0114 01:53:49.953814  106069 naming_controller.go:289] Starting NamingConditionController
I0114 01:53:49.953829  106069 establishing_controller.go:74] Starting EstablishingController
I0114 01:53:49.953849  106069 nonstructuralschema_controller.go:185] Starting NonStructuralSchemaConditionController
I0114 01:53:49.953866  106069 apiapproval_controller.go:184] Starting KubernetesAPIApprovalPolicyConformantConditionController
I0114 01:53:49.953867  106069 autoregister_controller.go:140] Starting autoregister controller
I0114 01:53:49.953883  106069 cache.go:32] Waiting for caches to sync for autoregister controller
W0114 01:53:49.954482  106069 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0114 01:53:49.954634  106069 cluster_authentication_trust_controller.go:440] Starting cluster_authentication_trust_controller controller
I0114 01:53:49.954642  106069 shared_informer.go:206] Waiting for caches to sync for cluster_authentication_trust_controller
I0114 01:53:49.955117  106069 dynamic_cafile_content.go:166] Starting client-ca-bundle::/tmp/kubernetes-kube-apiserver447156705/client-ca.crt
I0114 01:53:49.955163  106069 dynamic_cafile_content.go:166] Starting request-header::/tmp/kubernetes-kube-apiserver447156705/proxy-ca.crt
I0114 01:53:49.956064  106069 controller.go:81] Starting OpenAPI AggregationController
I0114 01:53:49.956945  106069 crdregistration_controller.go:111] Starting crd-autoregister controller
I0114 01:53:49.956960  106069 shared_informer.go:206] Waiting for caches to sync for crd-autoregister
E0114 01:53:49.960086  106069 controller.go:151] Unable to remove old endpoints from kubernetes service: StorageError: key not found, Code: 1, Key: /e0452299-09a2-454d-a942-b4a8d60613f1/registry/masterleases/127.0.0.1, ResourceVersion: 0, AdditionalErrorMsg: 
I0114 01:53:50.053870  106069 cache.go:39] Caches are synced for AvailableConditionController controller
I0114 01:53:50.054045  106069 cache.go:39] Caches are synced for autoregister controller
I0114 01:53:50.054092  106069 cache.go:39] Caches are synced for APIServiceRegistrationController controller
I0114 01:53:50.054781  106069 shared_informer.go:213] Caches are synced for cluster_authentication_trust_controller 
I0114 01:53:50.057101  106069 shared_informer.go:213] Caches are synced for crd-autoregister 
E0114 01:53:50.088859  106069 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0114 01:53:50.105868  106069 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0114 01:53:50.116031  106069 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0114 01:53:50.120535  106069 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0114 01:53:50.123036  106069 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
I0114 01:53:50.952571  106069 controller.go:107] OpenAPI AggregationController: Processing item 
I0114 01:53:50.952612  106069 controller.go:130] OpenAPI AggregationController: action for item : Nothing (removed from the queue).
I0114 01:53:50.952631  106069 controller.go:130] OpenAPI AggregationController: action for item k8s_internal_local_delegation_chain_0000000000: Nothing (removed from the queue).
I0114 01:53:50.962826  106069 storage_scheduling.go:133] created PriorityClass system-node-critical with value 2000001000
I0114 01:53:50.967546  106069 storage_scheduling.go:133] created PriorityClass system-cluster-critical with value 2000000000
I0114 01:53:50.967566  106069 storage_scheduling.go:142] all system priority classes are created successfully or already exist.
W0114 01:53:51.009457  106069 lease.go:224] Resetting endpoints for master service "kubernetes" to [127.0.0.1]
E0114 01:53:51.011155  106069 controller.go:222] unable to sync kubernetes service: Endpoints "kubernetes" is invalid: subsets[0].addresses[0].ip: Invalid value: "127.0.0.1": may not be in the loopback range (127.0.0.0/8)
W0114 01:53:51.135943  106069 cacher.go:162] Terminating all watchers from cacher *apiextensions.CustomResourceDefinition
W0114 01:53:51.136215  106069 cacher.go:162] Terminating all watchers from cacher *core.LimitRange
W0114 01:53:51.136341  106069 cacher.go:162] Terminating all watchers from cacher *core.ResourceQuota
W0114 01:53:51.136517  106069 cacher.go:162] Terminating all watchers from cacher *core.Secret
W0114 01:53:51.136907  106069 cacher.go:162] Terminating all watchers from cacher *core.ConfigMap
W0114 01:53:51.136970  106069 cacher.go:162] Terminating all watchers from cacher *core.Namespace
W0114 01:53:51.137089  106069 cacher.go:162] Terminating all watchers from cacher *core.Endpoints
W0114 01:53:51.137406  106069 cacher.go:162] Terminating all watchers from cacher *core.Pod
W0114 01:53:51.137654  106069 cacher.go:162] Terminating all watchers from cacher *core.ServiceAccount
W0114 01:53:51.138064  106069 cacher.go:162] Terminating all watchers from cacher *core.Service
W0114 01:53:51.140152  106069 cacher.go:162] Terminating all watchers from cacher *node.RuntimeClass
W0114 01:53:51.141195  106069 cacher.go:162] Terminating all watchers from cacher *scheduling.PriorityClass
W0114 01:53:51.141878  106069 cacher.go:162] Terminating all watchers from cacher *storage.StorageClass
W0114 01:53:51.143029  106069 cacher.go:162] Terminating all watchers from cacher *admissionregistration.ValidatingWebhookConfiguration
W0114 01:53:51.143074  106069 cacher.go:162] Terminating all watchers from cacher *admissionregistration.MutatingWebhookConfiguration
W0114 01:53:51.143256  106069 cacher.go:162] Terminating all watchers from cacher *apiregistration.APIService
I0114 01:53:51.143402  106069 controller.go:180] Shutting down kubernetes service endpoint reconciler
--- FAIL: TestDynamicClient (6.87s)
    testserver.go:181: runtime-config=map[api/all:true]
    testserver.go:182: Starting kube-apiserver on port 42985...
    testserver.go:198: Waiting for /healthz to be ok...
    dynamic_client_test.go:88: unexpected pod in list. wanted &v1.Pod{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"testjhlj7", GenerateName:"test", Namespace:"default", SelfLink:"/api/v1/namespaces/default/pods/testjhlj7", UID:"d51cbc72-a254-4809-affa-a1f7ca5b3cc8", ResourceVersion:"8161", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714563631, loc:(*time.Location)(0x7541d00)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"client.test", Operation:"Update", APIVersion:"v1", Time:(*v1.Time)(0xc0491dc120), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc0491dc140)}}}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"test", Image:"test-image", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"Always", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc04d24d968), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc0482870e0), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration{v1.Toleration{Key:"node.kubernetes.io/not-ready", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc04d24d990)}, v1.Toleration{Key:"node.kubernetes.io/unreachable", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc04d24d9b0)}}, HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(0xc04d24d9b8), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(0xc04d24d9bc), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}, Status:v1.PodStatus{Phase:"Pending", Conditions:[]v1.PodCondition(nil), Message:"", Reason:"", NominatedNodeName:"", HostIP:"", PodIP:"", PodIPs:[]v1.PodIP(nil), StartTime:(*v1.Time)(nil), InitContainerStatuses:[]v1.ContainerStatus(nil), ContainerStatuses:[]v1.ContainerStatus(nil), QOSClass:"BestEffort", EphemeralContainerStatuses:[]v1.ContainerStatus(nil)}}, got &v1.Pod{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"testjhlj7", GenerateName:"test", Namespace:"default", SelfLink:"/api/v1/namespaces/default/pods/testjhlj7", 
UID:"d51cbc72-a254-4809-affa-a1f7ca5b3cc8", ResourceVersion:"8161", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714563631, loc:(*time.Location)(0x7541d00)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"client.test", Operation:"Update", APIVersion:"v1", Time:(*v1.Time)(0xc0491dcee0), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc0491dcec0)}}}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"test", Image:"test-image", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"Always", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc04f8ca4b8), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc0483337a0), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration{v1.Toleration{Key:"node.kubernetes.io/not-ready", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc04f8ca500)}, v1.Toleration{Key:"node.kubernetes.io/unreachable", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc04f8ca520)}}, HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(0xc04f8ca498), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(0xc04f8ca479), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}, Status:v1.PodStatus{Phase:"Pending", Conditions:[]v1.PodCondition(nil), Message:"", Reason:"", NominatedNodeName:"", HostIP:"", PodIP:"", PodIPs:[]v1.PodIP(nil), StartTime:(*v1.Time)(nil), InitContainerStatuses:[]v1.ContainerStatus(nil), ContainerStatuses:[]v1.ContainerStatus(nil), QOSClass:"BestEffort", EphemeralContainerStatuses:[]v1.ContainerStatus(nil)}}

				from junit_20200114-015127.xml
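The "wanted" and "got" dumps above look identical because Go's %#v formatting prints only the addresses of pointer-typed fields (for example the ManagedFields Time and FieldsV1 pointers), so whichever field actually differs is not visible in the output. The sketch below is a hypothetical debugging helper, not part of the test itself: it re-serializes both Pods to JSON so that a real field-level difference becomes visible; the names and sample values in main are illustrative only.

package main

import (
	"encoding/json"
	"fmt"
	"reflect"

	v1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)

// diffPods is a hypothetical helper: when reflect.DeepEqual reports a
// mismatch but the %#v dumps look the same (pointer-typed fields print
// only their addresses), comparing the JSON serializations of the two
// objects usually makes the real difference visible.
func diffPods(want, got *v1.Pod) {
	if reflect.DeepEqual(want, got) {
		fmt.Println("pods are deeply equal")
		return
	}
	wantJSON, _ := json.MarshalIndent(want, "", "  ")
	gotJSON, _ := json.MarshalIndent(got, "", "  ")
	if string(wantJSON) == string(gotJSON) {
		// Serialized forms match, so the mismatch lives in state that does
		// not round-trip through JSON (e.g. a nil map vs. an empty map).
		fmt.Println("pods serialize identically; difference is not in any serialized field")
		return
	}
	fmt.Printf("wanted:\n%s\ngot:\n%s\n", wantJSON, gotJSON)
}

func main() {
	// Illustrative values only; testjhlj7 is the pod name from the failure above.
	want := &v1.Pod{ObjectMeta: metav1.ObjectMeta{Name: "testjhlj7", Namespace: "default"}}
	got := want.DeepCopy()
	got.ObjectMeta.Labels = map[string]string{"debug": "true"} // hypothetical divergence
	diffPods(want, got)
}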



Passed tests: 2610
Skipped tests: 4

Error lines from build-log.txt

... skipping 56 lines ...
Recording: record_command_canary
Running command: record_command_canary

+++ Running case: test-cmd.record_command_canary 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: record_command_canary
/home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh: line 155: bogus-expected-to-fail: command not found
!!! [0114 01:41:20] Call tree:
!!! [0114 01:41:20]  1: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:47 record_command_canary(...)
!!! [0114 01:41:20]  2: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:112 eVal(...)
!!! [0114 01:41:20]  3: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:131 juLog(...)
!!! [0114 01:41:20]  4: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:159 record_command(...)
!!! [0114 01:41:20]  5: hack/make-rules/test-cmd.sh:35 source(...)
+++ exit code: 1
+++ error: 1
+++ [0114 01:41:20] Running kubeadm tests
+++ [0114 01:41:26] Building go targets for linux/amd64:
    cmd/kubeadm
+++ [0114 01:42:12] Running tests without code coverage
{"Time":"2020-01-14T01:43:37.655913344Z","Action":"output","Package":"k8s.io/kubernetes/cmd/kubeadm/test/cmd","Output":"ok  \tk8s.io/kubernetes/cmd/kubeadm/test/cmd\t46.877s\n"}
✓  cmd/kubeadm/test/cmd (46.877s)
... skipping 302 lines ...
+++ [0114 01:45:30] Building kube-controller-manager
+++ [0114 01:45:36] Building go targets for linux/amd64:
    cmd/kube-controller-manager
+++ [0114 01:46:08] Starting controller-manager
Flag --port has been deprecated, see --secure-port instead.
I0114 01:46:09.292802   54459 serving.go:313] Generated self-signed cert in-memory
W0114 01:46:09.571597   54459 authentication.go:409] failed to read in-cluster kubeconfig for delegated authentication: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W0114 01:46:09.571647   54459 authentication.go:267] No authentication-kubeconfig provided in order to lookup client-ca-file in configmap/extension-apiserver-authentication in kube-system, so client certificate authentication won't work.
W0114 01:46:09.571655   54459 authentication.go:291] No authentication-kubeconfig provided in order to lookup requestheader-client-ca-file in configmap/extension-apiserver-authentication in kube-system, so request-header client certificate authentication won't work.
W0114 01:46:09.571669   54459 authorization.go:177] failed to read in-cluster kubeconfig for delegated authorization: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W0114 01:46:09.571701   54459 authorization.go:146] No authorization-kubeconfig provided, so SubjectAccessReview of authorization tokens won't work.
I0114 01:46:09.571734   54459 controllermanager.go:161] Version: v1.18.0-alpha.1.656+b008eda8b2dc0f
I0114 01:46:09.572770   54459 secure_serving.go:178] Serving securely on [::]:10257
I0114 01:46:09.572919   54459 tlsconfig.go:241] Starting DynamicServingCertificateController
I0114 01:46:09.573208   54459 deprecated_insecure_serving.go:53] Serving insecurely on [::]:10252
I0114 01:46:09.573286   54459 leaderelection.go:242] attempting to acquire leader lease  kube-system/kube-controller-manager...
... skipping 111 lines ...
I0114 01:46:10.113879   54459 daemon_controller.go:256] Starting daemon sets controller
I0114 01:46:10.113898   54459 shared_informer.go:206] Waiting for caches to sync for daemon sets
I0114 01:46:10.114115   54459 controllermanager.go:533] Started "statefulset"
I0114 01:46:10.114241   54459 stateful_set.go:145] Starting stateful set controller
I0114 01:46:10.114406   54459 shared_informer.go:206] Waiting for caches to sync for stateful set
I0114 01:46:10.114471   54459 node_lifecycle_controller.go:77] Sending events to api server
E0114 01:46:10.114513   54459 core.go:231] failed to start cloud node lifecycle controller: no cloud provider provided
W0114 01:46:10.114523   54459 controllermanager.go:525] Skipping "cloud-node-lifecycle"
W0114 01:46:10.114775   54459 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0114 01:46:10.114979   54459 controllermanager.go:533] Started "persistentvolume-expander"
I0114 01:46:10.115194   54459 expand_controller.go:319] Starting expand controller
I0114 01:46:10.115211   54459 shared_informer.go:206] Waiting for caches to sync for expand
I0114 01:46:10.115599   54459 controllermanager.go:533] Started "horizontalpodautoscaling"
I0114 01:46:10.115736   54459 horizontal.go:168] Starting HPA controller
I0114 01:46:10.115766   54459 shared_informer.go:206] Waiting for caches to sync for HPA
E0114 01:46:10.115953   54459 core.go:90] Failed to start service controller: WARNING: no cloud provider provided, services of type LoadBalancer will fail
W0114 01:46:10.115975   54459 controllermanager.go:525] Skipping "service"
I0114 01:46:10.116433   54459 controllermanager.go:533] Started "replicationcontroller"
I0114 01:46:10.116470   54459 replica_set.go:180] Starting replicationcontroller controller
I0114 01:46:10.116486   54459 shared_informer.go:206] Waiting for caches to sync for ReplicationController
I0114 01:46:10.116821   54459 controllermanager.go:533] Started "podgc"
I0114 01:46:10.117013   54459 gc_controller.go:88] Starting GC controller
... skipping 43 lines ...
I0114 01:46:10.626007   54459 controllermanager.go:533] Started "disruption"
W0114 01:46:10.626072   54459 controllermanager.go:525] Skipping "csrsigning"
I0114 01:46:10.626077   54459 disruption.go:330] Starting disruption controller
I0114 01:46:10.626096   54459 shared_informer.go:206] Waiting for caches to sync for disruption
+++ [0114 01:46:10] Testing kubectl version
I0114 01:46:10.658783   54459 shared_informer.go:213] Caches are synced for certificate-csrapproving 
W0114 01:46:10.667457   54459 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="127.0.0.1" does not exist
I0114 01:46:10.712386   54459 shared_informer.go:213] Caches are synced for attach detach 
I0114 01:46:10.713276   54459 shared_informer.go:213] Caches are synced for PVC protection 
I0114 01:46:10.713743   54459 shared_informer.go:213] Caches are synced for endpoint 
I0114 01:46:10.715443   54459 shared_informer.go:213] Caches are synced for expand 
I0114 01:46:10.715973   54459 shared_informer.go:213] Caches are synced for HPA 
I0114 01:46:10.716665   54459 shared_informer.go:213] Caches are synced for ReplicationController 
I0114 01:46:10.717227   54459 shared_informer.go:213] Caches are synced for GC 
I0114 01:46:10.717548   54459 shared_informer.go:213] Caches are synced for job 
I0114 01:46:10.718042   54459 shared_informer.go:213] Caches are synced for ReplicaSet 
I0114 01:46:10.718418   54459 shared_informer.go:213] Caches are synced for ClusterRoleAggregator 
I0114 01:46:10.718926   54459 shared_informer.go:213] Caches are synced for persistent volume 
I0114 01:46:10.726217   54459 shared_informer.go:213] Caches are synced for disruption 
I0114 01:46:10.726242   54459 disruption.go:338] Sending events to api server.
E0114 01:46:10.726833   54459 clusterroleaggregation_controller.go:180] view failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "view": the object has been modified; please apply your changes to the latest version and try again
E0114 01:46:10.726838   54459 clusterroleaggregation_controller.go:180] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again
E0114 01:46:10.734211   54459 clusterroleaggregation_controller.go:180] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again
{
  "major": "1",
  "minor": "18+",
  "gitVersion": "v1.18.0-alpha.1.656+b008eda8b2dc0f",
  "gitCommit": "b008eda8b2dc0fdd884cb56065f7f1667a970a16",
  "gitTreeState": "clean",
... skipping 80 lines ...
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_RESTMapper_evaluation_tests
+++ [0114 01:46:14] Creating namespace namespace-1578966374-4328
namespace/namespace-1578966374-4328 created
Context "test" modified.
+++ [0114 01:46:14] Testing RESTMapper
+++ [0114 01:46:14] "kubectl get unknownresourcetype" returns error as expected: error: the server doesn't have a resource type "unknownresourcetype"
+++ exit code: 0
NAME                              SHORTNAMES   APIGROUP                       NAMESPACED   KIND
bindings                                                                      true         Binding
componentstatuses                 cs                                          false        ComponentStatus
configmaps                        cm                                          true         ConfigMap
endpoints                         ep                                          true         Endpoints
... skipping 650 lines ...
has:valid-pod
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
has:valid-pod
core.sh:186: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: resource(s) were provided, but no name, label selector, or --all flag specified
core.sh:190: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:194: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: setting 'all' parameter but found a non empty selector.
core.sh:198: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
core.sh:206: Successful get pods -l'name in (valid-pod)' {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:211: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-kubectl-describe-pod\" }}found{{end}}{{end}}:: :
... skipping 12 lines ...
poddisruptionbudget.policy/test-pdb-2 created
core.sh:245: Successful get pdb/test-pdb-2 --namespace=test-kubectl-describe-pod {{.spec.minAvailable}}: 50%
poddisruptionbudget.policy/test-pdb-3 created
core.sh:251: Successful get pdb/test-pdb-3 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 2
poddisruptionbudget.policy/test-pdb-4 created
core.sh:255: Successful get pdb/test-pdb-4 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 50%
error: min-available and max-unavailable cannot be both specified
core.sh:261: Successful get pods --namespace=test-kubectl-describe-pod {{range.items}}{{.metadata.name}}:{{end}}: 
pod/env-test-pod created
matched TEST_CMD_1
matched <set to the key 'key-1' in secret 'test-secret'>
matched TEST_CMD_2
matched <set to the key 'key-2' of config map 'test-configmap'>
... skipping 188 lines ...
pod/valid-pod patched
core.sh:470: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: changed-with-yaml:
pod/valid-pod patched
core.sh:475: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:3.1:
pod/valid-pod patched
core.sh:491: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
+++ [0114 01:46:54] "kubectl patch with resourceVersion 526" returns error as expected: Error from server (Conflict): Operation cannot be fulfilled on pods "valid-pod": the object has been modified; please apply your changes to the latest version and try again
pod "valid-pod" deleted
pod/valid-pod replaced
core.sh:515: Successful get pod valid-pod {{(index .spec.containers 0).name}}: replaced-k8s-serve-hostname
Successful
message:error: --grace-period must have --force specified
has:\-\-grace-period must have \-\-force specified
Successful
message:error: --timeout must have --force specified
has:\-\-timeout must have \-\-force specified
W0114 01:46:55.502149   54459 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="node-v1-test" does not exist
node/node-v1-test created
node/node-v1-test replaced
I0114 01:46:55.757232   54459 event.go:278] Event(v1.ObjectReference{Kind:"Node", Namespace:"", Name:"node-v1-test", UID:"c0b5d482-aeb0-4967-a2f0-1bd8ccc3cb9f", APIVersion:"", ResourceVersion:"", FieldPath:""}): type: 'Normal' reason: 'RegisteredNode' Node node-v1-test event: Registered Node node-v1-test in Controller
core.sh:552: Successful get node node-v1-test {{.metadata.annotations.a}}: b
(Bnode "node-v1-test" deleted
core.sh:559: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
... skipping 24 lines ...
spec:
  containers:
  - image: k8s.gcr.io/pause:2.0
    name: kubernetes-pause
has:localonlyvalue
core.sh:585: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
error: 'name' already has a value (valid-pod), and --overwrite is false
core.sh:589: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
core.sh:593: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
pod/valid-pod labeled
core.sh:597: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod-super-sayan
core.sh:601: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
... skipping 86 lines ...
+++ Running case: test-cmd.run_kubectl_create_error_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_create_error_tests
+++ [0114 01:47:06] Creating namespace namespace-1578966426-5893
namespace/namespace-1578966426-5893 created
Context "test" modified.
+++ [0114 01:47:06] Testing kubectl create with error
Error: must specify one of -f and -k

Create a resource from a file or from stdin.

 JSON and YAML formats are accepted.

Examples:
... skipping 41 lines ...

Usage:
  kubectl create -f FILENAME [options]

Use "kubectl <command> --help" for more information about a given command.
Use "kubectl options" for a list of global command-line options (applies to all commands).
+++ [0114 01:47:06] "kubectl create with empty string list returns error as expected: error: error validating "hack/testdata/invalid-rc-with-empty-args.yaml": error validating data: ValidationError(ReplicationController.spec.template.spec.containers[0].args): unknown object type "nil" in ReplicationController.spec.template.spec.containers[0].args[0]; if you choose to ignore these errors, turn validation off with --validate=false
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
+++ exit code: 0
Recording: run_kubectl_apply_tests
Running command: run_kubectl_apply_tests

... skipping 17 lines ...
(Bpod "test-pod" deleted
customresourcedefinition.apiextensions.k8s.io/resources.mygroup.example.com created
I0114 01:47:09.698183   50985 client.go:361] parsed scheme: "endpoint"
I0114 01:47:09.698226   50985 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:47:09.703142   50985 controller.go:606] quota admission added evaluator for: resources.mygroup.example.com
kind.mygroup.example.com/myobj serverside-applied (server dry run)
Error from server (NotFound): resources.mygroup.example.com "myobj" not found
customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
+++ exit code: 0
Recording: run_kubectl_run_tests
Running command: run_kubectl_run_tests

+++ Running case: test-cmd.run_kubectl_run_tests 
... skipping 104 lines ...
Context "test" modified.
+++ [0114 01:47:13] Testing kubectl create filter
create.sh:30: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/selector-test-pod created
create.sh:34: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
Successful
message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
has:pods "selector-test-pod-dont-apply" not found
pod "selector-test-pod" deleted
+++ exit code: 0
Recording: run_kubectl_apply_deployments_tests
Running command: run_kubectl_apply_deployments_tests

... skipping 30 lines ...
I0114 01:47:16.934809   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966434-18133", Name:"nginx-8484dd655", UID:"59abee67-f7e9-4327-bef7-143e1cc57282", APIVersion:"apps/v1", ResourceVersion:"623", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-dn6vf
I0114 01:47:16.937833   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966434-18133", Name:"nginx-8484dd655", UID:"59abee67-f7e9-4327-bef7-143e1cc57282", APIVersion:"apps/v1", ResourceVersion:"623", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-zd4qq
I0114 01:47:16.940633   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966434-18133", Name:"nginx-8484dd655", UID:"59abee67-f7e9-4327-bef7-143e1cc57282", APIVersion:"apps/v1", ResourceVersion:"623", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-zn2wt
apps.sh:148: Successful get deployment nginx {{.metadata.name}}: nginx
I0114 01:47:20.575043   54459 horizontal.go:353] Horizontal Pod Autoscaler frontend has been deleted in namespace-1578966423-23161
Successful
message:Error from server (Conflict): error when applying patch:
{"metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1578966434-18133\",\"resourceVersion\":\"99\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx2\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx2\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"},"resourceVersion":"99"},"spec":{"selector":{"matchLabels":{"name":"nginx2"}},"template":{"metadata":{"labels":{"name":"nginx2"}}}}}
to:
Resource: "apps/v1, Resource=deployments", GroupVersionKind: "apps/v1, Kind=Deployment"
Name: "nginx", Namespace: "namespace-1578966434-18133"
for: "hack/testdata/deployment-label-change2.yaml": Operation cannot be fulfilled on deployments.apps "nginx": the object has been modified; please apply your changes to the latest version and try again
has:Error from server (Conflict)
E0114 01:47:25.625361   54459 replica_set.go:534] sync "namespace-1578966434-18133/nginx-8484dd655" failed with Operation cannot be fulfilled on replicasets.apps "nginx-8484dd655": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1578966434-18133/nginx-8484dd655, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 59abee67-f7e9-4327-bef7-143e1cc57282, UID in object meta: 
deployment.apps/nginx configured
I0114 01:47:26.601434   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966434-18133", Name:"nginx", UID:"fc414a0f-518a-4707-bacb-a78532eef6eb", APIVersion:"apps/v1", ResourceVersion:"663", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-668b6c7744 to 3
I0114 01:47:26.605991   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966434-18133", Name:"nginx-668b6c7744", UID:"8354eacb-5543-4e58-8a09-0d0f3befa7d2", APIVersion:"apps/v1", ResourceVersion:"664", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-hbpwk
I0114 01:47:26.608881   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966434-18133", Name:"nginx-668b6c7744", UID:"8354eacb-5543-4e58-8a09-0d0f3befa7d2", APIVersion:"apps/v1", ResourceVersion:"664", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-2lkc2
I0114 01:47:26.610980   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966434-18133", Name:"nginx-668b6c7744", UID:"8354eacb-5543-4e58-8a09-0d0f3befa7d2", APIVersion:"apps/v1", ResourceVersion:"664", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-lmw5s
Successful
... skipping 141 lines ...
+++ [0114 01:47:34] Creating namespace namespace-1578966454-14347
namespace/namespace-1578966454-14347 created
Context "test" modified.
+++ [0114 01:47:34] Testing kubectl get
get.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:37: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:45: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:{
    "apiVersion": "v1",
    "items": [],
... skipping 23 lines ...
has not:No resources found
Successful
message:NAME
has not:No resources found
get.sh:73: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:error: the server doesn't have a resource type "foobar"
has not:No resources found
Successful
message:No resources found in namespace-1578966454-14347 namespace.
has:No resources found
Successful
message:
has not:No resources found
Successful
message:No resources found in namespace-1578966454-14347 namespace.
has:No resources found
get.sh:93: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
Successful
message:Error from server (NotFound): pods "abc" not found
has not:List
Successful
message:I0114 01:47:36.227927   65015 loader.go:375] Config loaded from file:  /tmp/tmp.PaCMxnunqg/.kube/config
I0114 01:47:36.229360   65015 round_trippers.go:443] GET http://127.0.0.1:8080/version?timeout=32s 200 OK in 1 milliseconds
I0114 01:47:36.254604   65015 round_trippers.go:443] GET http://127.0.0.1:8080/api/v1/namespaces/default/pods 200 OK in 1 milliseconds
I0114 01:47:36.256443   65015 round_trippers.go:443] GET http://127.0.0.1:8080/api/v1/namespaces/default/replicationcontrollers 200 OK in 1 milliseconds
... skipping 479 lines ...
Successful
message:NAME    DATA   AGE
one     0      0s
three   0      0s
two     0      0s
STATUS    REASON          MESSAGE
Failure   InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has not:watch is only supported on individual resources
Successful
message:STATUS    REASON          MESSAGE
Failure   InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has not:watch is only supported on individual resources
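Both blocks above run kubectl get with a watch and a very short client-side request timeout: the list is printed first, the watch stream is then cut off mid-read, and kubectl reports that as the InternalError shown. A hedged sketch of the command shape; the test's exact flags may differ:

# Listing then watching is allowed; a 1s request timeout ends the watch stream
# early, which surfaces as "unable to decode an event from the watch stream".
kubectl get configmaps --watch --request-timeout=1s

The assertion only verifies that the old "watch is only supported on individual resources" error is absent.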
+++ [0114 01:47:43] Creating namespace namespace-1578966463-6972
namespace/namespace-1578966463-6972 created
Context "test" modified.
get.sh:153: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/valid-pod created
... skipping 56 lines ...
}
get.sh:158: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
<no value>Successful
message:valid-pod:
has:valid-pod:
Successful
message:error: error executing jsonpath "{.missing}": Error executing template: missing is not found. Printing more information for debugging the template:
	template was:
		{.missing}
	object given to jsonpath engine was:
		map[string]interface {}{"apiVersion":"v1", "kind":"Pod", "metadata":map[string]interface {}{"creationTimestamp":"2020-01-14T01:47:43Z", "labels":map[string]interface {}{"name":"valid-pod"}, "name":"valid-pod", "namespace":"namespace-1578966463-6972", "resourceVersion":"750", "selfLink":"/api/v1/namespaces/namespace-1578966463-6972/pods/valid-pod", "uid":"fa2ce50c-180f-4e3f-a364-bafef0e4c9f6"}, "spec":map[string]interface {}{"containers":[]interface {}{map[string]interface {}{"image":"k8s.gcr.io/serve_hostname", "imagePullPolicy":"Always", "name":"kubernetes-serve-hostname", "resources":map[string]interface {}{"limits":map[string]interface {}{"cpu":"1", "memory":"512Mi"}, "requests":map[string]interface {}{"cpu":"1", "memory":"512Mi"}}, "terminationMessagePath":"/dev/termination-log", "terminationMessagePolicy":"File"}}, "dnsPolicy":"ClusterFirst", "enableServiceLinks":true, "priority":0, "restartPolicy":"Always", "schedulerName":"default-scheduler", "securityContext":map[string]interface {}{}, "terminationGracePeriodSeconds":30}, "status":map[string]interface {}{"phase":"Pending", "qosClass":"Guaranteed"}}
has:missing is not found
error: error executing template "{{.missing}}": template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing"
Successful
message:Error executing template: template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing". Printing more information for debugging the template:
	template was:
		{{.missing}}
	raw data was:
		{"apiVersion":"v1","kind":"Pod","metadata":{"creationTimestamp":"2020-01-14T01:47:43Z","labels":{"name":"valid-pod"},"name":"valid-pod","namespace":"namespace-1578966463-6972","resourceVersion":"750","selfLink":"/api/v1/namespaces/namespace-1578966463-6972/pods/valid-pod","uid":"fa2ce50c-180f-4e3f-a364-bafef0e4c9f6"},"spec":{"containers":[{"image":"k8s.gcr.io/serve_hostname","imagePullPolicy":"Always","name":"kubernetes-serve-hostname","resources":{"limits":{"cpu":"1","memory":"512Mi"},"requests":{"cpu":"1","memory":"512Mi"}},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File"}],"dnsPolicy":"ClusterFirst","enableServiceLinks":true,"priority":0,"restartPolicy":"Always","schedulerName":"default-scheduler","securityContext":{},"terminationGracePeriodSeconds":30},"status":{"phase":"Pending","qosClass":"Guaranteed"}}
	object given to template engine was:
		map[apiVersion:v1 kind:Pod metadata:map[creationTimestamp:2020-01-14T01:47:43Z labels:map[name:valid-pod] name:valid-pod namespace:namespace-1578966463-6972 resourceVersion:750 selfLink:/api/v1/namespaces/namespace-1578966463-6972/pods/valid-pod uid:fa2ce50c-180f-4e3f-a364-bafef0e4c9f6] spec:map[containers:[map[image:k8s.gcr.io/serve_hostname imagePullPolicy:Always name:kubernetes-serve-hostname resources:map[limits:map[cpu:1 memory:512Mi] requests:map[cpu:1 memory:512Mi]] terminationMessagePath:/dev/termination-log terminationMessagePolicy:File]] dnsPolicy:ClusterFirst enableServiceLinks:true priority:0 restartPolicy:Always schedulerName:default-scheduler securityContext:map[] terminationGracePeriodSeconds:30] status:map[phase:Pending qosClass:Guaranteed]]
has:map has no entry for key "missing"
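Both failures above are intentional: each output template asks the pod object for a key it does not have, and each templating engine reports that in its own way (jsonpath with "missing is not found", go-template with "map has no entry for key"). A hedged sketch of equivalent direct commands:

kubectl get pod valid-pod -o jsonpath='{.missing}'
kubectl get pod valid-pod -o go-template='{{.missing}}'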
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
STATUS      REASON          MESSAGE
Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has:STATUS
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
STATUS      REASON          MESSAGE
Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has:valid-pod
Successful
message:pod/valid-pod
status/<unknown>
has not:STATUS
Successful
... skipping 45 lines ...
      (Client.Timeout exceeded while reading body)'
    reason: UnexpectedServerResponse
  - message: 'unable to decode an event from the watch stream: net/http: request canceled
      (Client.Timeout exceeded while reading body)'
    reason: ClientWatchDecoding
kind: Status
message: 'an error on the server ("unable to decode an event from the watch stream:
  net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented
  the request from succeeding'
metadata: {}
reason: InternalError
status: Failure
has not:STATUS
... skipping 42 lines ...
      (Client.Timeout exceeded while reading body)'
    reason: UnexpectedServerResponse
  - message: 'unable to decode an event from the watch stream: net/http: request canceled
      (Client.Timeout exceeded while reading body)'
    reason: ClientWatchDecoding
kind: Status
message: 'an error on the server ("unable to decode an event from the watch stream:
  net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented
  the request from succeeding'
metadata: {}
reason: InternalError
status: Failure
has:name: valid-pod
Successful
message:Error from server (NotFound): pods "invalid-pod" not found
has:"invalid-pod" not found
pod "valid-pod" deleted
get.sh:196: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/redis-master created
pod/valid-pod created
Successful
... skipping 35 lines ...
+++ command: run_kubectl_exec_pod_tests
+++ [0114 01:47:49] Creating namespace namespace-1578966469-30497
namespace/namespace-1578966469-30497 created
Context "test" modified.
+++ [0114 01:47:49] Testing kubectl exec POD COMMAND
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
pod/test-pod created
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pods "test-pod" not found
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod or type/name must be specified
pod "test-pod" deleted
+++ exit code: 0
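test-cmd runs an API server with no kubelet, so pods are created but never scheduled; kubectl exec therefore fails with "does not have a host assigned" rather than a missing-pod error, and that is exactly what these checks accept. A hedged sketch:

kubectl exec abc -- date        # Error from server (NotFound): pods "abc" not found
kubectl exec test-pod -- date   # Error from server (BadRequest): pod test-pod does not have a host assigned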
Recording: run_kubectl_exec_resource_name_tests
Running command: run_kubectl_exec_resource_name_tests

... skipping 2 lines ...
+++ command: run_kubectl_exec_resource_name_tests
+++ [0114 01:47:50] Creating namespace namespace-1578966470-20309
namespace/namespace-1578966470-20309 created
Context "test" modified.
+++ [0114 01:47:50] Testing kubectl exec TYPE/NAME COMMAND
Successful
message:error: the server doesn't have a resource type "foo"
has:error:
Successful
message:Error from server (NotFound): deployments.apps "bar" not found
has:"bar" not found
pod/test-pod created
replicaset.apps/frontend created
I0114 01:47:51.302824   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966470-20309", Name:"frontend", UID:"6a3454a7-72ab-4a31-8ed0-98f2b0a84e37", APIVersion:"apps/v1", ResourceVersion:"808", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-4rdvh
I0114 01:47:51.306214   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966470-20309", Name:"frontend", UID:"6a3454a7-72ab-4a31-8ed0-98f2b0a84e37", APIVersion:"apps/v1", ResourceVersion:"808", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-chn6h
I0114 01:47:51.306947   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966470-20309", Name:"frontend", UID:"6a3454a7-72ab-4a31-8ed0-98f2b0a84e37", APIVersion:"apps/v1", ResourceVersion:"808", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-pk48p
configmap/test-set-env-config created
Successful
message:error: cannot attach to *v1.ConfigMap: selector for *v1.ConfigMap not implemented
has:not implemented
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:not found
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod or type/name must be specified
Successful
message:Error from server (BadRequest): pod frontend-4rdvh does not have a host assigned
has not:not found
Successful
message:Error from server (BadRequest): pod frontend-4rdvh does not have a host assigned
has not:pod or type/name must be specified
pod "test-pod" deleted
replicaset.apps "frontend" deleted
configmap "test-set-env-config" deleted
+++ exit code: 0
Recording: run_create_secret_tests
Running command: run_create_secret_tests

+++ Running case: test-cmd.run_create_secret_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_create_secret_tests
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
message:user-specified
has:user-specified
Successful
{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-update-cm","uid":"5d7c0ade-bb02-4e38-ae3b-7fb3a4bae172","resourceVersion":"830","creationTimestamp":"2020-01-14T01:47:52Z"}}
... skipping 2 lines ...
has:uid
Successful
message:{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-update-cm","uid":"5d7c0ade-bb02-4e38-ae3b-7fb3a4bae172","resourceVersion":"831","creationTimestamp":"2020-01-14T01:47:52Z"},"data":{"key1":"config1"}}
has:config1
{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Success","details":{"name":"tester-update-cm","kind":"configmaps","uid":"5d7c0ade-bb02-4e38-ae3b-7fb3a4bae172"}}
Successful
message:Error from server (NotFound): configmaps "tester-update-cm" not found
has:configmaps "tester-update-cm" not found
+++ exit code: 0
Recording: run_kubectl_create_kustomization_directory_tests
Running command: run_kubectl_create_kustomization_directory_tests

+++ Running case: test-cmd.run_kubectl_create_kustomization_directory_tests 
... skipping 159 lines ...
valid-pod   0/1     Pending   0          0s
has:valid-pod
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          0s
STATUS      REASON          MESSAGE
Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has:Timeout exceeded while reading body
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          2s
has:valid-pod
Successful
message:error: Invalid timeout value. Timeout must be a single integer in seconds, or an integer followed by a corresponding time unit (e.g. 1s | 2m | 3h)
has:Invalid timeout value
pod "valid-pod" deleted
+++ exit code: 0
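The final check above feeds a malformed value to the client-side request timeout, which is validated before any request is sent: the value must be a bare integer (seconds) or an integer with a time unit. A hedged sketch; the flag shown matches that validation message, though the test's exact invocation may differ:

kubectl get pod valid-pod --request-timeout=1s       # accepted
kubectl get pod valid-pod --request-timeout=invalid  # error: Invalid timeout value ...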
Recording: run_crd_tests
Running command: run_crd_tests

... skipping 158 lines ...
foo.company.com/test patched
crd.sh:236: Successful get foos/test {{.patched}}: value1
foo.company.com/test patched
crd.sh:238: Successful get foos/test {{.patched}}: value2
foo.company.com/test patched
crd.sh:240: Successful get foos/test {{.patched}}: <no value>
+++ [0114 01:48:03] "kubectl patch --local" returns error as expected for CustomResource: error: cannot apply strategic merge patch for company.com/v1, Kind=Foo locally, try --type merge
{
    "apiVersion": "company.com/v1",
    "kind": "Foo",
    "metadata": {
        "annotations": {
            "kubernetes.io/change-cause": "kubectl patch foos/test --server=http://127.0.0.1:8080 --match-server-version=true --patch={\"patched\":null} --type=merge --record=true"
... skipping 193 lines ...
crd.sh:450: Successful get bars {{range.items}}{{.metadata.name}}:{{end}}: 
namespace/non-native-resources created
bar.company.com/test created
crd.sh:455: Successful get bars {{len .items}}: 1
(Bnamespace "non-native-resources" deleted
crd.sh:458: Successful get bars {{len .items}}: 0
Error from server (NotFound): namespaces "non-native-resources" not found
customresourcedefinition.apiextensions.k8s.io "foos.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "bars.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
customresourcedefinition.apiextensions.k8s.io "validfoos.company.com" deleted
+++ exit code: 0
Recording: run_cmd_with_img_tests
... skipping 11 lines ...
I0114 01:48:27.024526   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966506-17220", Name:"test1-6cdffdb5b8", UID:"d90f7604-cd17-4511-b482-b2d5c99ed2a8", APIVersion:"apps/v1", ResourceVersion:"1000", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test1-6cdffdb5b8-l28nh
Successful
message:deployment.apps/test1 created
has:deployment.apps/test1 created
deployment.apps "test1" deleted
W0114 01:48:27.225313   50985 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E0114 01:48:27.226688   54459 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: Invalid image name "InvalidImageName": invalid reference format
has:error: Invalid image name "InvalidImageName": invalid reference format
+++ exit code: 0
+++ [0114 01:48:27] Testing recursive resources
+++ [0114 01:48:27] Creating namespace namespace-1578966507-25145
W0114 01:48:27.330638   50985 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E0114 01:48:27.331846   54459 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578966507-25145 created
W0114 01:48:27.437696   50985 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E0114 01:48:27.439057   54459 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
W0114 01:48:27.555413   50985 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E0114 01:48:27.556624   54459 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:pod/busybox0 created
pod/busybox1 created
error: error validating "hack/testdata/recursive/pod/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
generic-resources.sh:211: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0114 01:48:28.228041   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:220: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: busybox:busybox:
Successful
message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
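This whole block points kubectl at a directory tree that deliberately contains one manifest whose kind key is misspelled, so recursive operations must still process the valid busybox0/busybox1 objects while reporting the broken file (as a validation error on create, or as "Object 'Kind' is missing" elsewhere). A hedged sketch of the shape of these invocations, using the path shown in the log:

# --recursive (-R) walks the directory; the good manifests are created and the
# broken one is reported without aborting the others.
kubectl create -f hack/testdata/recursive/pod --recursive
kubectl get pods -o go-template='{{range.items}}{{.metadata.name}}:{{end}}'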
E0114 01:48:28.333227   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:227: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0114 01:48:28.440361   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:28.557765   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:231: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
Successful
message:pod/busybox0 replaced
pod/busybox1 replaced
error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
generic-resources.sh:236: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:Name:         busybox0
Namespace:    namespace-1578966507-25145
Priority:     0
Node:         <none>
... skipping 155 lines ...
Node-Selectors:   <none>
Tolerations:      <none>
Events:           <none>
unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:246: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0114 01:48:29.229042   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:250: Successful get pods {{range.items}}{{.metadata.annotations.annotatekey}}:{{end}}: annotatevalue:annotatevalue:
Successful
message:pod/busybox0 annotated
pod/busybox1 annotated
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E0114 01:48:29.334673   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:255: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0114 01:48:29.441456   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:29.559075   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:259: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
Successful
message:error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
generic-resources.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx created
I0114 01:48:30.012380   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966507-25145", Name:"nginx", UID:"b9714c34-8ff7-473e-bb82-0a757d655be8", APIVersion:"apps/v1", ResourceVersion:"1024", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-f87d999f7 to 3
I0114 01:48:30.016032   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966507-25145", Name:"nginx-f87d999f7", UID:"69dce236-353c-4000-b414-7d05620f71e3", APIVersion:"apps/v1", ResourceVersion:"1025", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-z9tx4
I0114 01:48:30.019456   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966507-25145", Name:"nginx-f87d999f7", UID:"69dce236-353c-4000-b414-7d05620f71e3", APIVersion:"apps/v1", ResourceVersion:"1025", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-8bnfg
I0114 01:48:30.021267   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966507-25145", Name:"nginx-f87d999f7", UID:"69dce236-353c-4000-b414-7d05620f71e3", APIVersion:"apps/v1", ResourceVersion:"1025", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-xqnbz
generic-resources.sh:269: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx:
generic-resources.sh:270: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0114 01:48:30.230204   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
E0114 01:48:30.335877   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:274: Successful get deployment nginx {{ .apiVersion }}: apps/v1
Successful
message:apiVersion: extensions/v1beta1
kind: Deployment
metadata:
  creationTimestamp: null
... skipping 32 lines ...
      restartPolicy: Always
      schedulerName: default-scheduler
      securityContext: {}
      terminationGracePeriodSeconds: 30
status: {}
has:extensions/v1beta1
E0114 01:48:30.442810   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx" deleted
E0114 01:48:30.560277   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:281: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:285: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:290: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0114 01:48:30.918197   54459 namespace_controller.go:185] Namespace has been deleted non-native-resources
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:busybox0:busybox1:
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:299: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
E0114 01:48:31.231462   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:304: Successful get pods {{range.items}}{{.metadata.labels.mylabel}}:{{end}}: myvalue:myvalue:
Successful
message:pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E0114 01:48:31.337131   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:309: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0114 01:48:31.444103   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
E0114 01:48:31.561886   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:314: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: prom/busybox:prom/busybox:
Successful
message:pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:319: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:323: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "busybox0" force deleted
pod "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:328: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/busybox0 created
I0114 01:48:32.166156   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966507-25145", Name:"busybox0", UID:"a95c7b82-1a13-4db8-a29c-0414ed989dd1", APIVersion:"v1", ResourceVersion:"1056", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-2xb7t
replicationcontroller/busybox1 created
error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0114 01:48:32.171243   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966507-25145", Name:"busybox1", UID:"452f9403-2dbf-48d1-90be-2763cc7234ad", APIVersion:"v1", ResourceVersion:"1058", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-q778r
E0114 01:48:32.232677   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:332: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0114 01:48:32.338557   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:337: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0114 01:48:32.445651   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:338: Successful get rc busybox0 {{.spec.replicas}}: 1
E0114 01:48:32.563445   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:339: Successful get rc busybox1 {{.spec.replicas}}: 1
generic-resources.sh:344: Successful get hpa busybox0 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
(Bgeneric-resources.sh:345: Successful get hpa busybox1 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
Successful
message:horizontalpodautoscaler.autoscaling/busybox0 autoscaled
horizontalpodautoscaler.autoscaling/busybox1 autoscaled
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
horizontalpodautoscaler.autoscaling "busybox0" deleted
horizontalpodautoscaler.autoscaling "busybox1" deleted
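The min/max/target assertions just above come from autoscaling both replication controllers in the same recursive directory; as before, only the broken manifest is rejected. A hedged sketch of the pattern, with flag spellings taken from kubectl autoscale; the test's exact form may differ:

kubectl autoscale -f hack/testdata/recursive/rc --recursive --min=1 --max=2 --cpu-percent=80
kubectl delete hpa busybox0 busybox1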
generic-resources.sh:353: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0114 01:48:33.233812   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:354: Successful get rc busybox0 {{.spec.replicas}}: 1
E0114 01:48:33.339714   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:355: Successful get rc busybox1 {{.spec.replicas}}: 1
E0114 01:48:33.446911   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:359: Successful get service busybox0 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
E0114 01:48:33.564710   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:360: Successful get service busybox1 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
Successful
message:service/busybox0 exposed
service/busybox1 exposed
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:366: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:367: Successful get rc busybox0 {{.spec.replicas}}: 1
generic-resources.sh:368: Successful get rc busybox1 {{.spec.replicas}}: 1
I0114 01:48:34.059760   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966507-25145", Name:"busybox0", UID:"a95c7b82-1a13-4db8-a29c-0414ed989dd1", APIVersion:"v1", ResourceVersion:"1078", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-mqwh6
I0114 01:48:34.082847   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966507-25145", Name:"busybox1", UID:"452f9403-2dbf-48d1-90be-2763cc7234ad", APIVersion:"v1", ResourceVersion:"1082", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-kl2dc
generic-resources.sh:372: Successful get rc busybox0 {{.spec.replicas}}: 2
E0114 01:48:34.235224   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:373: Successful get rc busybox1 {{.spec.replicas}}: 2
Successful
message:replicationcontroller/busybox0 scaled
replicationcontroller/busybox1 scaled
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
E0114 01:48:34.340971   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:378: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0114 01:48:34.448353   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:34.565914   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:382: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
replicationcontroller "busybox0" force deleted
replicationcontroller "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:387: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx1-deployment created
I0114 01:48:34.882161   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966507-25145", Name:"nginx1-deployment", UID:"06d07341-562a-4179-beb3-781bca271255", APIVersion:"apps/v1", ResourceVersion:"1100", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx1-deployment-7bdbbfb5cf to 2
deployment.apps/nginx0-deployment created
error: error validating "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0114 01:48:34.885957   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966507-25145", Name:"nginx0-deployment", UID:"67da0671-e7ab-455a-bdf8-7f4ad508e6c8", APIVersion:"apps/v1", ResourceVersion:"1102", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx0-deployment-57c6bff7f6 to 2
I0114 01:48:34.892165   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966507-25145", Name:"nginx1-deployment-7bdbbfb5cf", UID:"232e08fd-e130-492c-93b5-5193005e4ac9", APIVersion:"apps/v1", ResourceVersion:"1101", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-7bdbbfb5cf-k5797
I0114 01:48:34.892208   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966507-25145", Name:"nginx0-deployment-57c6bff7f6", UID:"71b97f42-8244-4a81-9977-a634f50f35d7", APIVersion:"apps/v1", ResourceVersion:"1105", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-57c6bff7f6-kth4q
I0114 01:48:34.899352   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966507-25145", Name:"nginx1-deployment-7bdbbfb5cf", UID:"232e08fd-e130-492c-93b5-5193005e4ac9", APIVersion:"apps/v1", ResourceVersion:"1101", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-7bdbbfb5cf-hjpwd
I0114 01:48:34.899470   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966507-25145", Name:"nginx0-deployment-57c6bff7f6", UID:"71b97f42-8244-4a81-9977-a634f50f35d7", APIVersion:"apps/v1", ResourceVersion:"1105", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-57c6bff7f6-rrk6v
generic-resources.sh:391: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx0-deployment:nginx1-deployment:
generic-resources.sh:392: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
E0114 01:48:35.236579   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:396: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
Successful
message:deployment.apps/nginx1-deployment skipped rollback (current template already matches revision 1)
deployment.apps/nginx0-deployment skipped rollback (current template already matches revision 1)
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
E0114 01:48:35.342073   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx1-deployment paused
deployment.apps/nginx0-deployment paused
E0114 01:48:35.449663   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:404: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: true:true:
Successful
message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
E0114 01:48:35.567255   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx1-deployment resumed
deployment.apps/nginx0-deployment resumed
generic-resources.sh:410: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: <no value>:<no value>:
Successful
message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
... skipping 3 lines ...
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx0-deployment
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx1-deployment
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
deployment.apps "nginx1-deployment" force deleted
deployment.apps "nginx0-deployment" force deleted
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
E0114 01:48:36.237848   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:36.343552   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:36.451135   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:36.568445   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:426: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:48:37.239322   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/busybox0 created
I0114 01:48:37.287019   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966507-25145", Name:"busybox0", UID:"5e117fd3-fa9a-410a-bf1f-28954099923d", APIVersion:"v1", ResourceVersion:"1150", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-6d46l
replicationcontroller/busybox1 created
error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0114 01:48:37.292355   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966507-25145", Name:"busybox1", UID:"6b8de3f2-ec8b-4c91-a276-7952c210a383", APIVersion:"v1", ResourceVersion:"1152", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-mgdvr
E0114 01:48:37.344727   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:430: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0114 01:48:37.452440   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:no rollbacker has been implemented for "ReplicationController"
Successful
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
E0114 01:48:37.569698   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox0" pausing is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox1" pausing is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:replicationcontrollers "busybox0" resuming is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:replicationcontrollers "busybox1" resuming is not supported
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
replicationcontroller "busybox0" force deleted
replicationcontroller "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
E0114 01:48:38.240632   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:38.345983   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:38.453678   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:38.571028   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_namespace_tests
Running command: run_namespace_tests

+++ Running case: test-cmd.run_namespace_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_namespace_tests
+++ [0114 01:48:38] Testing kubectl(v1:namespaces)
namespace/my-namespace created
core.sh:1314: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
(Bnamespace "my-namespace" deleted
E0114 01:48:39.241843   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:39.347438   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:39.455063   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:39.572381   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:40.243318   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:40.348692   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:40.456316   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:40.573786   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:41.244556   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:41.350003   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:41.457524   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:41.575248   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:42.245930   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:42.351486   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:42.458972   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:42.576706   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:43.247333   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 01:48:43.316906   54459 shared_informer.go:206] Waiting for caches to sync for resource quota
I0114 01:48:43.316967   54459 shared_informer.go:213] Caches are synced for resource quota 
E0114 01:48:43.352754   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:43.460209   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:43.578056   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 01:48:43.834164   54459 shared_informer.go:206] Waiting for caches to sync for garbage collector
I0114 01:48:43.834226   54459 shared_informer.go:213] Caches are synced for garbage collector 
E0114 01:48:44.248392   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/my-namespace condition met
E0114 01:48:44.354014   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Error from server (NotFound): namespaces "my-namespace" not found
has: not found
namespace/my-namespace created
E0114 01:48:44.461425   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1323: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
E0114 01:48:44.579466   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
namespace "kube-node-lease" deleted
namespace "my-namespace" deleted
namespace "namespace-1578966371-26714" deleted
namespace "namespace-1578966374-4328" deleted
... skipping 26 lines ...
namespace "namespace-1578966474-3858" deleted
namespace "namespace-1578966475-13374" deleted
namespace "namespace-1578966477-5828" deleted
namespace "namespace-1578966479-20132" deleted
namespace "namespace-1578966506-17220" deleted
namespace "namespace-1578966507-25145" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:warning: deleting cluster-scoped resources
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
namespace "kube-node-lease" deleted
namespace "my-namespace" deleted
namespace "namespace-1578966371-26714" deleted
... skipping 27 lines ...
namespace "namespace-1578966474-3858" deleted
namespace "namespace-1578966475-13374" deleted
namespace "namespace-1578966477-5828" deleted
namespace "namespace-1578966479-20132" deleted
namespace "namespace-1578966506-17220" deleted
namespace "namespace-1578966507-25145" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:namespace "my-namespace" deleted
core.sh:1335: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"other\" }}found{{end}}{{end}}:: :
namespace/other created
core.sh:1339: Successful get namespaces/other {{.metadata.name}}: other
core.sh:1343: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:48:45.249627   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/valid-pod created
E0114 01:48:45.355263   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1347: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
E0114 01:48:45.462811   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1349: Successful get pods -n other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
E0114 01:48:45.580732   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: a resource cannot be retrieved by name across all namespaces
has:a resource cannot be retrieved by name across all namespaces
core.sh:1356: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
core.sh:1360: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
(Bnamespace "other" deleted
E0114 01:48:46.250992   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:46.356646   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:46.465888   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:46.581807   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:47.252306   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:47.357936   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:47.467363   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:47.583131   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 01:48:47.667024   54459 horizontal.go:353] Horizontal Pod Autoscaler busybox0 has been deleted in namespace-1578966507-25145
I0114 01:48:47.670240   54459 horizontal.go:353] Horizontal Pod Autoscaler busybox1 has been deleted in namespace-1578966507-25145
E0114 01:48:48.253589   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:48.359264   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:48.468617   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:48.584034   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:49.255045   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:49.360122   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:49.469811   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:49.585285   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:50.257362   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:50.364325   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:50.470704   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:50.586410   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_secrets_test
Running command: run_secrets_test

+++ Running case: test-cmd.run_secrets_test 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_secrets_test
+++ [0114 01:48:51] Creating namespace namespace-1578966531-24661
E0114 01:48:51.258865   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578966531-24661 created
E0114 01:48:51.365639   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0114 01:48:51] Testing secrets
I0114 01:48:51.443998   71475 loader.go:375] Config loaded from file:  /tmp/tmp.PaCMxnunqg/.kube/config
Successful
message:apiVersion: v1
data:
... skipping 27 lines ...
  key1: dmFsdWUx
kind: Secret
metadata:
  creationTimestamp: null
  name: test
has not:example.com
E0114 01:48:51.471998   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:725: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-secrets\" }}found{{end}}{{end}}:: :
E0114 01:48:51.587657   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/test-secrets created
core.sh:729: Successful get namespaces/test-secrets {{.metadata.name}}: test-secrets
core.sh:733: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
secret/test-secret created
core.sh:737: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
core.sh:738: Successful get secret/test-secret --namespace=test-secrets {{.type}}: test-type
E0114 01:48:52.260115   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret "test-secret" deleted
E0114 01:48:52.366917   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:748: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:48:52.473342   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret/test-secret created
core.sh:752: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
E0114 01:48:52.588813   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:753: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/dockerconfigjson
(Bsecret "test-secret" deleted
core.sh:763: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
secret/test-secret created
core.sh:766: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
core.sh:767: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/tls
E0114 01:48:53.261389   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret "test-secret" deleted
E0114 01:48:53.368124   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret/test-secret created
E0114 01:48:53.474595   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:773: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
E0114 01:48:53.590010   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:774: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/tls
(Bsecret "test-secret" deleted
secret/secret-string-data created
core.sh:796: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
core.sh:797: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
core.sh:798: Successful get secret/secret-string-data --namespace=test-secrets  {{.stringData}}: <no value>
E0114 01:48:54.262798   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret "secret-string-data" deleted
I0114 01:48:54.348934   54459 namespace_controller.go:185] Namespace has been deleted my-namespace
E0114 01:48:54.369513   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:807: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:48:54.475776   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret "test-secret" deleted
E0114 01:48:54.591377   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace "test-secrets" deleted
I0114 01:48:54.816002   54459 namespace_controller.go:185] Namespace has been deleted kube-node-lease
I0114 01:48:54.837708   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966371-26714
I0114 01:48:54.844908   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966374-4328
I0114 01:48:54.847126   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966393-1340
I0114 01:48:54.856105   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966394-18882
... skipping 9 lines ...
I0114 01:48:55.121014   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966426-22990
I0114 01:48:55.123875   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966421-8476
I0114 01:48:55.123925   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966426-5893
I0114 01:48:55.133884   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966423-23161
I0114 01:48:55.135969   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966422-14362
I0114 01:48:55.189879   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966430-17537
E0114 01:48:55.264153   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 01:48:55.277670   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966433-32417
I0114 01:48:55.313754   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966453-9499
I0114 01:48:55.317253   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966452-15405
I0114 01:48:55.345443   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966434-18133
I0114 01:48:55.347755   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966474-2835
I0114 01:48:55.349195   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966469-30497
I0114 01:48:55.354444   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966454-14347
I0114 01:48:55.363489   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966474-3858
I0114 01:48:55.363999   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966463-6972
E0114 01:48:55.370942   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 01:48:55.385968   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966470-20309
I0114 01:48:55.441618   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966475-13374
I0114 01:48:55.457162   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966477-5828
I0114 01:48:55.465268   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966479-20132
E0114 01:48:55.476937   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 01:48:55.502816   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966506-17220
I0114 01:48:55.541211   54459 namespace_controller.go:185] Namespace has been deleted namespace-1578966507-25145
E0114 01:48:55.592734   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 01:48:56.097511   54459 namespace_controller.go:185] Namespace has been deleted other
E0114 01:48:56.265440   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:56.372152   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:56.478145   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:56.593999   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:57.266921   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:57.373356   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:57.479676   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:57.595360   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:58.268198   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:58.374876   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:58.480977   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:58.596644   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:59.269416   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:59.376124   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:59.482393   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:48:59.598056   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_configmap_tests
Running command: run_configmap_tests

+++ Running case: test-cmd.run_configmap_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_configmap_tests
+++ [0114 01:48:59] Creating namespace namespace-1578966539-26086
namespace/namespace-1578966539-26086 created
Context "test" modified.
+++ [0114 01:49:00] Testing configmaps
configmap/test-configmap created
E0114 01:49:00.270652   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:28: Successful get configmap/test-configmap {{.metadata.name}}: test-configmap
E0114 01:49:00.377355   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap "test-configmap" deleted
E0114 01:49:00.483604   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:33: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-configmaps\" }}found{{end}}{{end}}:: :
E0114 01:49:00.599341   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/test-configmaps created
core.sh:37: Successful get namespaces/test-configmaps {{.metadata.name}}: test-configmaps
core.sh:41: Successful get configmaps {{range.items}}{{ if eq .metadata.name \"test-configmap\" }}found{{end}}{{end}}:: :
core.sh:42: Successful get configmaps {{range.items}}{{ if eq .metadata.name \"test-binary-configmap\" }}found{{end}}{{end}}:: :
configmap/test-configmap created
configmap/test-binary-configmap created
core.sh:48: Successful get configmap/test-configmap --namespace=test-configmaps {{.metadata.name}}: test-configmap
E0114 01:49:01.271893   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:49: Successful get configmap/test-binary-configmap --namespace=test-configmaps {{.metadata.name}}: test-binary-configmap
E0114 01:49:01.378744   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:01.484711   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap "test-configmap" deleted
E0114 01:49:01.600535   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap "test-binary-configmap" deleted
namespace "test-configmaps" deleted
E0114 01:49:02.273154   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:02.380100   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:02.485890   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:02.601827   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:03.274542   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:03.381383   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:03.487386   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:03.603423   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:04.275860   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:04.382752   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:04.488665   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:04.604690   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 01:49:04.727830   54459 namespace_controller.go:185] Namespace has been deleted test-secrets
E0114 01:49:05.277087   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:05.383920   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:05.489990   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:05.605898   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:06.278423   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:06.385134   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:06.491314   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:06.607362   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_client_config_tests
Running command: run_client_config_tests

+++ Running case: test-cmd.run_client_config_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_client_config_tests
+++ [0114 01:49:06] Creating namespace namespace-1578966546-20389
namespace/namespace-1578966546-20389 created
Context "test" modified.
+++ [0114 01:49:07] Testing client config
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
E0114 01:49:07.279696   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
E0114 01:49:07.386485   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Error in configuration: context was not found for specified context: missing-context
has:context was not found for specified context: missing-context
E0114 01:49:07.492903   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: no server found for cluster "missing-cluster"
has:no server found for cluster "missing-cluster"
Successful
message:error: auth info "missing-user" does not exist
has:auth info "missing-user" does not exist
E0114 01:49:07.608666   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: error loading config file "/tmp/newconfig.yaml": no kind "Config" is registered for version "v-1" in scheme "k8s.io/client-go/tools/clientcmd/api/latest/latest.go:50"
has:error loading config file
Successful
message:error: stat missing-config: no such file or directory
has:no such file or directory
+++ exit code: 0
Recording: run_service_accounts_tests
Running command: run_service_accounts_tests

+++ Running case: test-cmd.run_service_accounts_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_service_accounts_tests
+++ [0114 01:49:07] Creating namespace namespace-1578966547-5404
namespace/namespace-1578966547-5404 created
Context "test" modified.
+++ [0114 01:49:08] Testing service accounts
core.sh:828: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-service-accounts\" }}found{{end}}{{end}}:: :
E0114 01:49:08.280865   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/test-service-accounts created
E0114 01:49:08.387692   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:832: Successful get namespaces/test-service-accounts {{.metadata.name}}: test-service-accounts
serviceaccount/test-service-account created
E0114 01:49:08.494224   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:838: Successful get serviceaccount/test-service-account --namespace=test-service-accounts {{.metadata.name}}: test-service-account
E0114 01:49:08.609934   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
serviceaccount "test-service-account" deleted
namespace "test-service-accounts" deleted
E0114 01:49:09.282227   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:09.388912   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:09.495824   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:09.611335   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:10.283736   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:10.390120   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:10.497052   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:10.612459   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:11.285005   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:11.391504   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:11.498430   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:11.613809   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 01:49:11.813887   54459 namespace_controller.go:185] Namespace has been deleted test-configmaps
E0114 01:49:12.286617   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:12.392698   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:12.499656   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:12.615160   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:13.287930   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:13.393977   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:13.500860   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:13.616284   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_job_tests
Running command: run_job_tests

+++ Running case: test-cmd.run_job_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_job_tests
+++ [0114 01:49:13] Creating namespace namespace-1578966553-25687
namespace/namespace-1578966553-25687 created
Context "test" modified.
+++ [0114 01:49:14] Testing job
batch.sh:30: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-jobs\" }}found{{end}}{{end}}:: :
namespace/test-jobs created
E0114 01:49:14.289041   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
batch.sh:34: Successful get namespaces/test-jobs {{.metadata.name}}: test-jobs
E0114 01:49:14.395253   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
kubectl run --generator=cronjob/v1beta1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
cronjob.batch/pi created
E0114 01:49:14.502003   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
batch.sh:39: Successful get cronjob/pi --namespace=test-jobs {{.metadata.name}}: pi
E0114 01:49:14.617537   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAME   SCHEDULE       SUSPEND   ACTIVE   LAST SCHEDULE   AGE
pi     59 23 31 2 *   False     0        <none>          0s
Name:                          pi
Namespace:                     test-jobs
Labels:                        run=pi
Annotations:                   <none>
Schedule:                      59 23 31 2 *
Concurrency Policy:            Allow
Suspend:                       False
Successful Job History Limit:  3
Failed Job History Limit:      1
Starting Deadline Seconds:     <unset>
Selector:                      <unset>
Parallelism:                   <unset>
Completions:                   <unset>
Pod Template:
  Labels:  run=pi
... skipping 33 lines ...
                run=pi
Annotations:    cronjob.kubernetes.io/instantiate: manual
Controlled By:  CronJob/pi
Parallelism:    1
Completions:    1
Start Time:     Tue, 14 Jan 2020 01:49:15 +0000
Pods Statuses:  1 Running / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  controller-uid=374b7c85-55ea-44a1-bdde-9425fa844a3c
           job-name=test-job
           run=pi
  Containers:
   pi:
... skipping 12 lines ...
    Mounts:       <none>
  Volumes:        <none>
Events:
  Type    Reason            Age   From            Message
  ----    ------            ----  ----            -------
  Normal  SuccessfulCreate  1s    job-controller  Created pod: test-job-f88j9
E0114 01:49:15.290189   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
job.batch "test-job" deleted
E0114 01:49:15.396409   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
cronjob.batch "pi" deleted
E0114 01:49:15.503138   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace "test-jobs" deleted
E0114 01:49:15.618483   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:16.291577   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:16.397649   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:16.504329   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:16.619888   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:17.292855   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:17.399066   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:17.505610   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:17.621074   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:18.294026   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:18.400411   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:18.507164   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:18.622411   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 01:49:18.831327   54459 namespace_controller.go:185] Namespace has been deleted test-service-accounts
E0114 01:49:19.295298   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:19.401742   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:19.508478   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:19.623653   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:20.296572   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:20.403255   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:20.509768   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:20.624450   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_create_job_tests
Running command: run_create_job_tests

+++ Running case: test-cmd.run_create_job_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
... skipping 4 lines ...
I0114 01:49:20.965323   54459 event.go:278] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1578966560-29850", Name:"test-job", UID:"ebb1568d-198a-46ec-ba34-d5ec177e1606", APIVersion:"batch/v1", ResourceVersion:"1515", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-jvxx9
job.batch/test-job created
create.sh:86: Successful get job test-job {{(index .spec.template.spec.containers 0).image}}: k8s.gcr.io/nginx:test-cmd
(Bjob.batch "test-job" deleted
I0114 01:49:21.242568   54459 event.go:278] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1578966560-29850", Name:"test-job-pi", UID:"17c07017-ce58-46fb-a3ac-db5c683eb2cf", APIVersion:"batch/v1", ResourceVersion:"1522", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-pi-28r5m
job.batch/test-job-pi created
E0114 01:49:21.297661   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
create.sh:92: Successful get job test-job-pi {{(index .spec.template.spec.containers 0).image}}: k8s.gcr.io/perl
E0114 01:49:21.404512   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
job.batch "test-job-pi" deleted
kubectl run --generator=cronjob/v1beta1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
E0114 01:49:21.511089   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
cronjob.batch/test-pi created
I0114 01:49:21.613405   54459 event.go:278] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1578966560-29850", Name:"my-pi", UID:"711d90a6-d15b-4c0f-aa79-87224429f0d7", APIVersion:"batch/v1", ResourceVersion:"1530", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: my-pi-n9sgk
job.batch/my-pi created
E0114 01:49:21.625303   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:[perl -Mbignum=bpi -wle print bpi(10)]
has:perl -Mbignum=bpi -wle print bpi(10)
job.batch "my-pi" deleted
cronjob.batch "test-pi" deleted
+++ exit code: 0
... skipping 5 lines ...
+++ command: run_pod_templates_tests
+++ [0114 01:49:21] Creating namespace namespace-1578966561-24965
namespace/namespace-1578966561-24965 created
Context "test" modified.
+++ [0114 01:49:22] Testing pod templates
core.sh:1421: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:49:22.298955   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:22.405816   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 01:49:22.413779   50985 controller.go:606] quota admission added evaluator for: podtemplates
podtemplate/nginx created
E0114 01:49:22.512251   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1425: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:
NAME    CONTAINERS   IMAGES   POD LABELS
nginx   nginx        nginx    name=nginx
E0114 01:49:22.626506   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1433: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:
podtemplate "nginx" deleted
core.sh:1437: Successful get podtemplate {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
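Note: a rough sketch of the pod template lifecycle this case runs through (the manifest path is illustrative, not taken from the log):
  kubectl create -f <podtemplate-manifest.yaml>   # creates podtemplate/nginx
  kubectl get podtemplates
  kubectl delete podtemplate nginx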
Recording: run_service_tests
Running command: run_service_tests

+++ Running case: test-cmd.run_service_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_service_tests
Context "test" modified.
+++ [0114 01:49:23] Testing kubectl(v1:services)
core.sh:858: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E0114 01:49:23.300503   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:23.407196   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master created
E0114 01:49:23.513423   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:862: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
E0114 01:49:23.627417   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Labels:
matched Selector:
matched IP:
matched Port:
matched Endpoints:
... skipping 124 lines ...
IP:                10.0.0.31
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
E0114 01:49:24.301705   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:              kubernetes
Namespace:         default
Labels:            component=apiserver
                   provider=kubernetes
Annotations:       <none>
... skipping 16 lines ...
Type:              ClusterIP
IP:                10.0.0.31
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
E0114 01:49:24.408231   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:              kubernetes
Namespace:         default
Labels:            component=apiserver
                   provider=kubernetes
Annotations:       <none>
... skipping 19 lines ...
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
core.sh:882: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
E0114 01:49:24.514625   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apiVersion: v1
kind: Service
metadata:
  creationTimestamp: null
  labels:
    app: redis
... skipping 5 lines ...
  - port: 6379
    targetPort: 6379
  selector:
    role: padawan
status:
  loadBalancer: {}
E0114 01:49:24.628547   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apiVersion: v1
kind: Service
metadata:
  creationTimestamp: "2020-01-14T01:49:23Z"
  labels:
    app: redis
... skipping 42 lines ...
  selector:
    role: padawan
  sessionAffinity: None
  type: ClusterIP
status:
  loadBalancer: {}
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
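Note: the error above is kubectl's response to combining --local with a resource name instead of a file. The two selector invocations appear to be approximately the following (flags inferred from the YAML dumps and the error text; not echoed in the log):
  kubectl set selector services redis-master role=padawan --dry-run -o yaml   # server untouched; prints the YAML above
  kubectl set selector services redis-master role=padawan --local -o yaml     # fails: --local requires -f/--filename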
core.sh:898: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
E0114 01:49:25.302928   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:25.409555   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master selector updated
E0114 01:49:25.515894   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Error from server (Conflict): Operation cannot be fulfilled on services "redis-master": the object has been modified; please apply your changes to the latest version and try again
has:Conflict
I0114 01:49:25.617716   54459 namespace_controller.go:185] Namespace has been deleted test-jobs
E0114 01:49:25.629664   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:911: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
service "redis-master" deleted
core.sh:918: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
core.sh:922: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
service/redis-master created
E0114 01:49:26.304378   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:926: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
E0114 01:49:26.410911   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:930: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
E0114 01:49:26.517183   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:26.630892   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/service-v1-test created
core.sh:951: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:service-v1-test:
service/service-v1-test replaced
core.sh:958: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:service-v1-test:
service "redis-master" deleted
service "service-v1-test" deleted
E0114 01:49:27.305291   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:27.412355   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:966: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E0114 01:49:27.518397   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:970: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E0114 01:49:27.632004   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master created
service/redis-slave created
core.sh:975: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:redis-slave:
Successful
message:NAME           RSRC
kubernetes     144
redis-master   1572
redis-slave    1575
has:redis-master
core.sh:985: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:redis-slave:
E0114 01:49:28.306690   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "redis-master" deleted
service "redis-slave" deleted
E0114 01:49:28.413603   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:992: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E0114 01:49:28.519552   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:996: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E0114 01:49:28.633196   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/beep-boop created
core.sh:1000: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: beep-boop:kubernetes:
core.sh:1004: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: beep-boop:kubernetes:
service "beep-boop" deleted
core.sh:1011: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
core.sh:1015: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
kubectl run --generator=deployment/apps.v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
I0114 01:49:29.228923   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"default", Name:"testmetadata", UID:"9e971f64-60ac-4631-a14b-bd52f6e688ce", APIVersion:"apps/v1", ResourceVersion:"1589", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set testmetadata-bd968f46 to 2
I0114 01:49:29.235032   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"default", Name:"testmetadata-bd968f46", UID:"d30da9eb-f599-4432-8f34-2deecf6e9ca5", APIVersion:"apps/v1", ResourceVersion:"1591", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: testmetadata-bd968f46-6fbjv
I0114 01:49:29.237247   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"default", Name:"testmetadata-bd968f46", UID:"d30da9eb-f599-4432-8f34-2deecf6e9ca5", APIVersion:"apps/v1", ResourceVersion:"1591", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: testmetadata-bd968f46-n68qz
service/testmetadata created
deployment.apps/testmetadata created
E0114 01:49:29.307878   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1019: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: testmetadata:
E0114 01:49:29.415160   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1020: Successful get service testmetadata {{.metadata.annotations}}: map[zone-context:home]
E0114 01:49:29.520838   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/exposemetadata exposed
E0114 01:49:29.634403   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1026: Successful get service exposemetadata {{.metadata.annotations}}: map[zone-context:work]
service "exposemetadata" deleted
service "testmetadata" deleted
deployment.apps "testmetadata" deleted
+++ exit code: 0
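Note: the service lifecycle exercised above maps roughly onto the following kubectl commands (the redis-master manifest path is illustrative; only the resource names and selector values are taken from the log):
  kubectl create -f <redis-master-service.yaml>
  kubectl get services
  kubectl describe services redis-master
  kubectl set selector services redis-master role=padawan
  kubectl delete service redis-master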
Recording: run_daemonset_tests
... skipping 4 lines ...
+++ command: run_daemonset_tests
+++ [0114 01:49:29] Creating namespace namespace-1578966569-14874
namespace/namespace-1578966569-14874 created
Context "test" modified.
+++ [0114 01:49:30] Testing kubectl(v1:daemonsets)
apps.sh:30: Successful get daemonsets {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:49:30.309015   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 01:49:30.398612   50985 controller.go:606] quota admission added evaluator for: daemonsets.apps
daemonset.apps/bind created
I0114 01:49:30.408436   50985 controller.go:606] quota admission added evaluator for: controllerrevisions.apps
E0114 01:49:30.415942   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:34: Successful get daemonsets bind {{.metadata.generation}}: 1
E0114 01:49:30.521918   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:30.635667   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind configured
apps.sh:37: Successful get daemonsets bind {{.metadata.generation}}: 1
daemonset.apps/bind image updated
apps.sh:40: Successful get daemonsets bind {{.metadata.generation}}: 2
daemonset.apps/bind env updated
apps.sh:42: Successful get daemonsets bind {{.metadata.generation}}: 3
E0114 01:49:31.310471   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind resource requirements updated
E0114 01:49:31.417072   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:44: Successful get daemonsets bind {{.metadata.generation}}: 4
daemonset.apps/bind restarted
E0114 01:49:31.522911   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:48: Successful get daemonsets bind {{.metadata.generation}}: 5
E0114 01:49:31.637034   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps "bind" deleted
+++ exit code: 0
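Note: each successful mutation above bumps .metadata.generation by one (1 through 5 in the assertions). A sketch of the commands, with the image taken from later output and the env/resources values purely illustrative:
  kubectl apply -f <daemonset-manifest.yaml>                               # daemonset.apps/bind, generation 1
  kubectl set image daemonsets/bind *=k8s.gcr.io/pause:latest              # generation 2
  kubectl set env daemonsets/bind FOO=bar                                  # generation 3
  kubectl set resources daemonsets/bind --limits=cpu=200m,memory=512Mi     # generation 4
  kubectl rollout restart daemonset/bind                                   # generation 5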
Recording: run_daemonset_history_tests
Running command: run_daemonset_history_tests

+++ Running case: test-cmd.run_daemonset_history_tests 
... skipping 2 lines ...
+++ [0114 01:49:31] Creating namespace namespace-1578966571-16050
namespace/namespace-1578966571-16050 created
Context "test" modified.
+++ [0114 01:49:31] Testing kubectl(v1:daemonsets, v1:controllerrevisions)
apps.sh:66: Successful get daemonsets {{range.items}}{{.metadata.name}}:{{end}}: 
daemonset.apps/bind created
E0114 01:49:32.311672   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:70: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[deprecated.daemonset.template.generation:1 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1578966571-16050"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:2.0","name":"kubernetes-pause"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
E0114 01:49:32.418109   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind skipped rollback (current template already matches revision 1)
E0114 01:49:32.524018   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:73: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
E0114 01:49:32.638126   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:74: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
daemonset.apps/bind configured
apps.sh:77: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
apps.sh:78: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:79: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
(Bapps.sh:80: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[deprecated.daemonset.template.generation:1 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1578966571-16050"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:2.0","name":"kubernetes-pause"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:map[deprecated.daemonset.template.generation:2 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1578966571-16050"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:latest","name":"kubernetes-pause"},{"image":"k8s.gcr.io/nginx:test-cmd","name":"app"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
E0114 01:49:33.312933   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind will roll back to Pod Template:
  Labels:	service=bind
  Containers:
   kubernetes-pause:
    Image:	k8s.gcr.io/pause:2.0
    Port:	<none>
    Host Port:	<none>
    Environment:	<none>
    Mounts:	<none>
  Volumes:	<none>
 (dry run)
E0114 01:49:33.419446   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:83: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
E0114 01:49:33.525218   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:84: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0114 01:49:33.639418   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:85: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
daemonset.apps/bind rolled back
E0114 01:49:33.774576   54459 daemon_controller.go:291] namespace-1578966571-16050/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1578966571-16050", SelfLink:"/apis/apps/v1/namespaces/namespace-1578966571-16050/daemonsets/bind", UID:"c06277a5-fbd4-4279-9a4e-72163d289a3f", ResourceVersion:"1658", Generation:3, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714563372, loc:(*time.Location)(0x6b26ba0)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"deprecated.daemonset.template.generation":"3", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true\"},\"labels\":{\"service\":\"bind\"},\"name\":\"bind\",\"namespace\":\"namespace-1578966571-16050\"},\"spec\":{\"selector\":{\"matchLabels\":{\"service\":\"bind\"}},\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n", "kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"kube-controller-manager", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001f86200), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc001f86220)}, v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001f86240), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc001f86260)}}}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc001f86280), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:2.0", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), 
VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc001f7fff8), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc002bca2a0), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc001f862c0), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc001e460e0)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc0029fe04c)}, Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:2, UpdatedNumberScheduled:0, NumberAvailable:0, NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
apps.sh:88: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:89: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
Successful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
apps.sh:93: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:94: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
E0114 01:49:34.314192   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind rolled back
E0114 01:49:34.387201   54459 daemon_controller.go:291] namespace-1578966571-16050/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1578966571-16050", SelfLink:"/apis/apps/v1/namespaces/namespace-1578966571-16050/daemonsets/bind", UID:"c06277a5-fbd4-4279-9a4e-72163d289a3f", ResourceVersion:"1661", Generation:4, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714563372, loc:(*time.Location)(0x6b26ba0)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"deprecated.daemonset.template.generation":"4", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true\"},\"labels\":{\"service\":\"bind\"},\"name\":\"bind\",\"namespace\":\"namespace-1578966571-16050\"},\"spec\":{\"selector\":{\"matchLabels\":{\"service\":\"bind\"}},\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n", "kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"kube-controller-manager", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc000c349a0), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc000c349c0)}, v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc000c349e0), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc000c34a00)}}}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc000c34a40), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:latest", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), 
VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}, v1.Container{Name:"app", Image:"k8s.gcr.io/nginx:test-cmd", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc00262e938), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc000ddaf00), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc000c34a80), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc000da0258)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc00262e98c)}, Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:3, UpdatedNumberScheduled:0, NumberAvailable:0, NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
E0114 01:49:34.420537   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:97: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
E0114 01:49:34.526511   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:98: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0114 01:49:34.640627   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:99: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
daemonset.apps "bind" deleted
+++ exit code: 0
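Note: the rollback sequence above corresponds roughly to (revision numbers other than the failing 1000000 are assumptions; the dry-run pod template and the "unable to find specified revision" error appear verbatim in the log):
  kubectl rollout undo daemonset/bind --dry-run              # prints the pod template it would roll back to
  kubectl rollout undo daemonset/bind --to-revision=1000000  # fails: unable to find specified revision
  kubectl rollout undo daemonset/bind                        # "daemonset.apps/bind rolled back"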
Recording: run_rc_tests
Running command: run_rc_tests

... skipping 3 lines ...
+++ [0114 01:49:34] Creating namespace namespace-1578966574-26266
namespace/namespace-1578966574-26266 created
Context "test" modified.
+++ [0114 01:49:35] Testing kubectl(v1:replicationcontrollers)
core.sh:1052: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/frontend created
E0114 01:49:35.315256   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 01:49:35.321602   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"frontend", UID:"04c0ebc4-0d59-490f-a91d-96b7cc52285d", APIVersion:"v1", ResourceVersion:"1671", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-qsz5w
I0114 01:49:35.332802   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"frontend", UID:"04c0ebc4-0d59-490f-a91d-96b7cc52285d", APIVersion:"v1", ResourceVersion:"1671", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-bzn72
I0114 01:49:35.332848   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"frontend", UID:"04c0ebc4-0d59-490f-a91d-96b7cc52285d", APIVersion:"v1", ResourceVersion:"1671", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-jqv9h
replicationcontroller "frontend" deleted
E0114 01:49:35.421636   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1057: Successful get pods -l "name=frontend" {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:49:35.527781   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1061: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:49:35.641925   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend created
I0114 01:49:35.791353   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"frontend", UID:"ff598360-3b7a-46ec-b29a-22c11568a494", APIVersion:"v1", ResourceVersion:"1687", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-6xnj4
I0114 01:49:35.794717   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"frontend", UID:"ff598360-3b7a-46ec-b29a-22c11568a494", APIVersion:"v1", ResourceVersion:"1687", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-mhs69
I0114 01:49:35.794780   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"frontend", UID:"ff598360-3b7a-46ec-b29a-22c11568a494", APIVersion:"v1", ResourceVersion:"1687", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-rnz7f
core.sh:1065: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
matched Name:
... skipping 9 lines ...
Namespace:    namespace-1578966574-26266
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
Namespace:    namespace-1578966574-26266
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
Namespace:    namespace-1578966574-26266
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 4 lines ...
      memory:  100Mi
    Environment:
      GET_HOSTS_FROM:  dns
    Mounts:            <none>
  Volumes:             <none>
E0114 01:49:36.316980   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1073: Successful describe
Name:         frontend
Namespace:    namespace-1578966574-26266
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 10 lines ...
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-6xnj4
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-mhs69
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-rnz7f
E0114 01:49:36.422934   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Name:
matched Pod Template:
matched Labels:
matched Selector:
matched Replicas:
... skipping 5 lines ...
Namespace:    namespace-1578966574-26266
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-6xnj4
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-mhs69
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-rnz7f
E0114 01:49:36.529011   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1578966574-26266
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-6xnj4
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-mhs69
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-rnz7f
E0114 01:49:36.643305   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1578966574-26266
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 11 lines ...
Namespace:    namespace-1578966574-26266
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 15 lines ...
core.sh:1085: Successful get rc frontend {{.spec.replicas}}: 3
replicationcontroller/frontend scaled
E0114 01:49:37.057060   54459 replica_set.go:199] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1578966574-26266 /api/v1/namespaces/namespace-1578966574-26266/replicationcontrollers/frontend ff598360-3b7a-46ec-b29a-22c11568a494 1698 2 2020-01-14 01:49:35 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v4 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc002f1dc68 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] []}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:1,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
I0114 01:49:37.066419   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"frontend", UID:"ff598360-3b7a-46ec-b29a-22c11568a494", APIVersion:"v1", ResourceVersion:"1698", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-6xnj4
core.sh:1089: Successful get rc frontend {{.spec.replicas}}: 2
core.sh:1093: Successful get rc frontend {{.spec.replicas}}: 2
E0114 01:49:37.318206   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: Expected replicas to be 3, was 2
E0114 01:49:37.424157   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1097: Successful get rc frontend {{.spec.replicas}}: 2
E0114 01:49:37.531453   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1101: Successful get rc frontend {{.spec.replicas}}: 2
E0114 01:49:37.644584   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend scaled
I0114 01:49:37.689558   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"frontend", UID:"ff598360-3b7a-46ec-b29a-22c11568a494", APIVersion:"v1", ResourceVersion:"1704", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-wm9nm
core.sh:1105: Successful get rc frontend {{.spec.replicas}}: 3
core.sh:1109: Successful get rc frontend {{.spec.replicas}}: 3
(BE0114 01:49:38.011765   54459 replica_set.go:199] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1578966574-26266 /api/v1/namespaces/namespace-1578966574-26266/replicationcontrollers/frontend ff598360-3b7a-46ec-b29a-22c11568a494 1709 4 2020-01-14 01:49:35 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v4 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc0028332e8 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] []}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:3,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
replicationcontroller/frontend scaled
I0114 01:49:38.020490   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"frontend", UID:"ff598360-3b7a-46ec-b29a-22c11568a494", APIVersion:"v1", ResourceVersion:"1709", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-wm9nm
core.sh:1113: Successful get rc frontend {{.spec.replicas}}: 2
replicationcontroller "frontend" deleted
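Note: the scaling assertions above (3 -> 2 -> 3 -> 2 replicas, plus the "Expected replicas to be 3, was 2" precondition failure) correspond roughly to commands of this shape; the exact flags are not echoed in the log:
  kubectl scale rc frontend --replicas=2
  kubectl scale rc frontend --current-replicas=3 --replicas=2   # fails: the precondition expected 3 replicas but found 2
  kubectl scale rc frontend --replicas=3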
E0114 01:49:38.319382   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/redis-master created
I0114 01:49:38.393426   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"redis-master", UID:"cbe2816e-f55a-4e6b-827e-f6c2b4e6c897", APIVersion:"v1", ResourceVersion:"1720", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-29ffh
E0114 01:49:38.425254   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:38.532692   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/redis-slave created
I0114 01:49:38.584979   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"redis-slave", UID:"c7c60e75-0218-4f71-bbe3-a55d598da82d", APIVersion:"v1", ResourceVersion:"1725", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-pkx7k
I0114 01:49:38.590914   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"redis-slave", UID:"c7c60e75-0218-4f71-bbe3-a55d598da82d", APIVersion:"v1", ResourceVersion:"1725", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-q2mg6
E0114 01:49:38.645791   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/redis-master scaled
I0114 01:49:38.687119   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"redis-master", UID:"cbe2816e-f55a-4e6b-827e-f6c2b4e6c897", APIVersion:"v1", ResourceVersion:"1732", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-cj9sz
replicationcontroller/redis-slave scaled
I0114 01:49:38.690563   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"redis-master", UID:"cbe2816e-f55a-4e6b-827e-f6c2b4e6c897", APIVersion:"v1", ResourceVersion:"1732", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-xdf2b
I0114 01:49:38.691169   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"redis-master", UID:"cbe2816e-f55a-4e6b-827e-f6c2b4e6c897", APIVersion:"v1", ResourceVersion:"1732", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-krc44
I0114 01:49:38.696127   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"redis-slave", UID:"c7c60e75-0218-4f71-bbe3-a55d598da82d", APIVersion:"v1", ResourceVersion:"1734", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-4h2f7
... skipping 6 lines ...
I0114 01:49:39.177102   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment", UID:"a4dc2556-c8b9-437f-a32d-016720a0856a", APIVersion:"apps/v1", ResourceVersion:"1768", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I0114 01:49:39.179680   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment-6986c7bc94", UID:"0f0b3a26-9825-4cd8-b12f-0036d52d83b4", APIVersion:"apps/v1", ResourceVersion:"1769", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-g8hvm
I0114 01:49:39.183637   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment-6986c7bc94", UID:"0f0b3a26-9825-4cd8-b12f-0036d52d83b4", APIVersion:"apps/v1", ResourceVersion:"1769", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-xpdph
I0114 01:49:39.184346   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment-6986c7bc94", UID:"0f0b3a26-9825-4cd8-b12f-0036d52d83b4", APIVersion:"apps/v1", ResourceVersion:"1769", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-mfs5t
deployment.apps/nginx-deployment scaled
I0114 01:49:39.305742   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment", UID:"a4dc2556-c8b9-437f-a32d-016720a0856a", APIVersion:"apps/v1", ResourceVersion:"1783", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-6986c7bc94 to 1
E0114 01:49:39.320649   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 01:49:39.322494   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment-6986c7bc94", UID:"0f0b3a26-9825-4cd8-b12f-0036d52d83b4", APIVersion:"apps/v1", ResourceVersion:"1784", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-6986c7bc94-g8hvm
I0114 01:49:39.323554   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment-6986c7bc94", UID:"0f0b3a26-9825-4cd8-b12f-0036d52d83b4", APIVersion:"apps/v1", ResourceVersion:"1784", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-6986c7bc94-xpdph
core.sh:1133: Successful get deployment nginx-deployment {{.spec.replicas}}: 1
E0114 01:49:39.426523   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-deployment" deleted
E0114 01:49:39.533903   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:service/expose-test-deployment exposed
has:service/expose-test-deployment exposed
E0114 01:49:39.647160   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "expose-test-deployment" deleted
Successful
message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
See 'kubectl expose -h' for help and examples
has:invalid deployment: no selectors
deployment.apps/nginx-deployment created
I0114 01:49:39.981735   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment", UID:"33255c29-9332-4f17-aa12-0589382b9790", APIVersion:"apps/v1", ResourceVersion:"1807", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I0114 01:49:39.985341   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment-6986c7bc94", UID:"2e159073-37e8-4aab-be8a-00e3ecbaffe2", APIVersion:"apps/v1", ResourceVersion:"1808", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-6b2tt
I0114 01:49:39.991113   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment-6986c7bc94", UID:"2e159073-37e8-4aab-be8a-00e3ecbaffe2", APIVersion:"apps/v1", ResourceVersion:"1808", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-vkg45
I0114 01:49:39.991204   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment-6986c7bc94", UID:"2e159073-37e8-4aab-be8a-00e3ecbaffe2", APIVersion:"apps/v1", ResourceVersion:"1808", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-vqnsk
core.sh:1152: Successful get deployment nginx-deployment {{.spec.replicas}}: 3
service/nginx-deployment exposed
core.sh:1156: Successful get service nginx-deployment {{(index .spec.ports 0).port}}: 80
E0114 01:49:40.321894   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-deployment" deleted
service "nginx-deployment" deleted
E0114 01:49:40.427580   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:40.535254   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend created
I0114 01:49:40.557923   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"frontend", UID:"a231bc35-a8f4-4ae9-87da-d0aea2e6b239", APIVersion:"v1", ResourceVersion:"1828", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-25fsf
I0114 01:49:40.561321   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"frontend", UID:"a231bc35-a8f4-4ae9-87da-d0aea2e6b239", APIVersion:"v1", ResourceVersion:"1828", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-4zlfc
I0114 01:49:40.561904   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"frontend", UID:"a231bc35-a8f4-4ae9-87da-d0aea2e6b239", APIVersion:"v1", ResourceVersion:"1828", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-f7jjf
E0114 01:49:40.648378   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1163: Successful get rc frontend {{.spec.replicas}}: 3
service/frontend exposed
core.sh:1167: Successful get service frontend {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
service/frontend-2 exposed
core.sh:1171: Successful get service frontend-2 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 443
pod/valid-pod created
E0114 01:49:41.323239   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/frontend-3 exposed
E0114 01:49:41.428732   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1176: Successful get service frontend-3 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 444
service/frontend-4 exposed
E0114 01:49:41.536129   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1180: Successful get service frontend-4 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: default 80
E0114 01:49:41.649643   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/frontend-5 exposed
core.sh:1184: Successful get service frontend-5 {{(index .spec.ports 0).port}}: 80
pod "valid-pod" deleted
service "frontend" deleted
service "frontend-2" deleted
service "frontend-3" deleted
service "frontend-4" deleted
service "frontend-5" deleted
Successful
message:error: cannot expose a Node
has:cannot expose
Successful
message:The Service "invalid-large-service-name-that-has-more-than-sixty-three-characters" is invalid: metadata.name: Invalid value: "invalid-large-service-name-that-has-more-than-sixty-three-characters": must be no more than 63 characters
has:metadata.name: Invalid value
E0114 01:49:42.324401   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:service/kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
has:kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
service "kubernetes-serve-hostname-testing-sixty-three-characters-in-len" deleted
E0114 01:49:42.429682   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:service/etcd-server exposed
has:etcd-server exposed
E0114 01:49:42.537148   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1214: Successful get service etcd-server {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: port-1 2380
E0114 01:49:42.651016   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1215: Successful get service etcd-server {{(index .spec.ports 1).name}} {{(index .spec.ports 1).port}}: port-2 2379
service "etcd-server" deleted
core.sh:1221: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
replicationcontroller "frontend" deleted
core.sh:1225: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:1229: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:49:43.325746   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend created
I0114 01:49:43.387160   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"frontend", UID:"17c1d209-8653-4cac-9680-0cf8d428f114", APIVersion:"v1", ResourceVersion:"1900", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-tmpwm
I0114 01:49:43.390128   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"frontend", UID:"17c1d209-8653-4cac-9680-0cf8d428f114", APIVersion:"v1", ResourceVersion:"1900", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-s94nq
I0114 01:49:43.390183   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"frontend", UID:"17c1d209-8653-4cac-9680-0cf8d428f114", APIVersion:"v1", ResourceVersion:"1900", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-gz9z5
E0114 01:49:43.430886   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:43.538517   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/redis-slave created
I0114 01:49:43.576325   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"redis-slave", UID:"0026b885-529f-4be8-83f7-ef7d852c69df", APIVersion:"v1", ResourceVersion:"1909", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-5z7sv
I0114 01:49:43.579879   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"redis-slave", UID:"0026b885-529f-4be8-83f7-ef7d852c69df", APIVersion:"v1", ResourceVersion:"1909", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-q4rpl
E0114 01:49:43.652253   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1234: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
core.sh:1238: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
replicationcontroller "frontend" deleted
replicationcontroller "redis-slave" deleted
core.sh:1242: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:1246: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/frontend created
I0114 01:49:44.261877   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"frontend", UID:"1067c4dc-003f-4f28-aa6c-756c7cd17dfd", APIVersion:"v1", ResourceVersion:"1928", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-8g7ln
I0114 01:49:44.265626   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"frontend", UID:"1067c4dc-003f-4f28-aa6c-756c7cd17dfd", APIVersion:"v1", ResourceVersion:"1928", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-mrw65
I0114 01:49:44.265676   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966574-26266", Name:"frontend", UID:"1067c4dc-003f-4f28-aa6c-756c7cd17dfd", APIVersion:"v1", ResourceVersion:"1928", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-q8t5x
E0114 01:49:44.327039   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1249: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
E0114 01:49:44.432029   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling/frontend autoscaled
E0114 01:49:44.539713   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1252: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
horizontalpodautoscaler.autoscaling "frontend" deleted
E0114 01:49:44.653275   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling/frontend autoscaled
core.sh:1256: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
horizontalpodautoscaler.autoscaling "frontend" deleted
Error: required flag(s) "max" not set
replicationcontroller "frontend" deleted
core.sh:1265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
apiVersion: apps/v1
kind: Deployment
metadata:
  creationTimestamp: null
... skipping 24 lines ...
          limits:
            cpu: 300m
          requests:
            cpu: 300m
      terminationGracePeriodSeconds: 0
status: {}
E0114 01:49:45.328206   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Error from server (NotFound): deployments.apps "nginx-deployment-resources" not found
E0114 01:49:45.433122   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:45.540973   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment-resources created
I0114 01:49:45.557886   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment-resources", UID:"cd56ae08-46c8-4fee-a8cc-3104eede63eb", APIVersion:"apps/v1", ResourceVersion:"1950", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-67f8cfff5 to 3
I0114 01:49:45.560952   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment-resources-67f8cfff5", UID:"38945c0f-43e9-4d21-aa6c-d00804b47bde", APIVersion:"apps/v1", ResourceVersion:"1951", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-ttt2p
I0114 01:49:45.563209   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment-resources-67f8cfff5", UID:"38945c0f-43e9-4d21-aa6c-d00804b47bde", APIVersion:"apps/v1", ResourceVersion:"1951", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-vd9kb
I0114 01:49:45.565442   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment-resources-67f8cfff5", UID:"38945c0f-43e9-4d21-aa6c-d00804b47bde", APIVersion:"apps/v1", ResourceVersion:"1951", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-mfrqv
E0114 01:49:45.654623   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1271: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment-resources:
core.sh:1272: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
core.sh:1273: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment-resources resource requirements updated
I0114 01:49:45.958203   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment-resources", UID:"cd56ae08-46c8-4fee-a8cc-3104eede63eb", APIVersion:"apps/v1", ResourceVersion:"1964", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-55c547f795 to 1
I0114 01:49:45.961923   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment-resources-55c547f795", UID:"98e84c6f-01ab-4551-b273-0a5f1c95638b", APIVersion:"apps/v1", ResourceVersion:"1965", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-55c547f795-7ns74
core.sh:1276: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 100m:
core.sh:1277: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
error: unable to find container named redis
E0114 01:49:46.329468   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment-resources resource requirements updated
I0114 01:49:46.395607   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment-resources", UID:"cd56ae08-46c8-4fee-a8cc-3104eede63eb", APIVersion:"apps/v1", ResourceVersion:"1974", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-55c547f795 to 0
I0114 01:49:46.400903   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment-resources-55c547f795", UID:"98e84c6f-01ab-4551-b273-0a5f1c95638b", APIVersion:"apps/v1", ResourceVersion:"1978", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-55c547f795-7ns74
I0114 01:49:46.402762   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment-resources", UID:"cd56ae08-46c8-4fee-a8cc-3104eede63eb", APIVersion:"apps/v1", ResourceVersion:"1976", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6d86564b45 to 1
I0114 01:49:46.407476   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment-resources-6d86564b45", UID:"2a4d5d46-b6d9-43e4-9e9a-839db848c944", APIVersion:"apps/v1", ResourceVersion:"1982", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6d86564b45-rb4lk
E0114 01:49:46.433996   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1282: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
E0114 01:49:46.542211   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1283: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
E0114 01:49:46.655876   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment-resources resource requirements updated
I0114 01:49:46.744041   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment-resources", UID:"cd56ae08-46c8-4fee-a8cc-3104eede63eb", APIVersion:"apps/v1", ResourceVersion:"1995", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-67f8cfff5 to 2
I0114 01:49:46.753347   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment-resources", UID:"cd56ae08-46c8-4fee-a8cc-3104eede63eb", APIVersion:"apps/v1", ResourceVersion:"1997", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6c478d4fdb to 1
I0114 01:49:46.756431   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment-resources-67f8cfff5", UID:"38945c0f-43e9-4d21-aa6c-d00804b47bde", APIVersion:"apps/v1", ResourceVersion:"1999", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-67f8cfff5-ttt2p
I0114 01:49:46.760851   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966574-26266", Name:"nginx-deployment-resources-6c478d4fdb", UID:"cc355e36-1641-48a7-afc3-fcf447e93316", APIVersion:"apps/v1", ResourceVersion:"2002", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6c478d4fdb-kjj9s
core.sh:1286: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
... skipping 73 lines ...
    status: "True"
    type: Progressing
  observedGeneration: 4
  replicas: 4
  unavailableReplicas: 4
  updatedReplicas: 1
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
E0114 01:49:47.330889   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1292: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
E0114 01:49:47.435556   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1293: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
E0114 01:49:47.543606   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1294: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
E0114 01:49:47.656978   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-deployment-resources" deleted
+++ exit code: 0
Recording: run_deployment_tests
Running command: run_deployment_tests

+++ Running case: test-cmd.run_deployment_tests 
... skipping 7 lines ...
I0114 01:49:48.058419   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966587-24120", Name:"test-nginx-extensions", UID:"4bd0fd9b-cbd0-4bb4-9f4b-427e3da48241", APIVersion:"apps/v1", ResourceVersion:"2033", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-nginx-extensions-5559c76db7 to 1
I0114 01:49:48.065303   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"test-nginx-extensions-5559c76db7", UID:"694350af-8425-441b-8a5b-33d84402f6c5", APIVersion:"apps/v1", ResourceVersion:"2034", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-nginx-extensions-5559c76db7-n4wxt
apps.sh:185: Successful get deploy test-nginx-extensions {{(index .spec.template.spec.containers 0).name}}: nginx
Successful
message:10
has not:2
E0114 01:49:48.332141   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:apps/v1
has:apps/v1
E0114 01:49:48.437534   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "test-nginx-extensions" deleted
deployment.apps/test-nginx-apps created
I0114 01:49:48.540400   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966587-24120", Name:"test-nginx-apps", UID:"426c1e4f-a7dd-4d04-a6a0-1716f4624b8d", APIVersion:"apps/v1", ResourceVersion:"2047", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-nginx-apps-79b9bd9585 to 1
I0114 01:49:48.542971   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"test-nginx-apps-79b9bd9585", UID:"986bc2da-8184-4a50-8c9a-68dfc3443f01", APIVersion:"apps/v1", ResourceVersion:"2048", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-nginx-apps-79b9bd9585-qfx6m
E0114 01:49:48.545089   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:198: Successful get deploy test-nginx-apps {{(index .spec.template.spec.containers 0).name}}: nginx
E0114 01:49:48.658332   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:10
has:10
Successful
message:apps/v1
has:apps/v1
... skipping 13 lines ...
                pod-template-hash=79b9bd9585
Annotations:    deployment.kubernetes.io/desired-replicas: 1
                deployment.kubernetes.io/max-replicas: 2
                deployment.kubernetes.io/revision: 1
Controlled By:  Deployment/test-nginx-apps
Replicas:       1 current / 1 desired
Pods Status:    0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=test-nginx-apps
           pod-template-hash=79b9bd9585
  Containers:
   nginx:
    Image:        k8s.gcr.io/nginx:test-cmd
... skipping 36 lines ...
Node-Selectors:   <none>
Tolerations:      <none>
Events:           <none>
deployment.apps "test-nginx-apps" deleted
apps.sh:214: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx-with-command created
E0114 01:49:49.333131   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 01:49:49.336212   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966587-24120", Name:"nginx-with-command", UID:"5f3092f3-8a42-4d21-a175-a3cfdbcd2bfa", APIVersion:"apps/v1", ResourceVersion:"2064", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-with-command-757c6f58dd to 1
I0114 01:49:49.340565   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-with-command-757c6f58dd", UID:"6682023f-79c8-4588-bc27-024be55b96b6", APIVersion:"apps/v1", ResourceVersion:"2065", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-with-command-757c6f58dd-4jvs8
apps.sh:218: Successful get deploy nginx-with-command {{(index .spec.template.spec.containers 0).name}}: nginx
E0114 01:49:49.438794   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-with-command" deleted
E0114 01:49:49.546232   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:224: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:49:49.659553   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/deployment-with-unixuserid created
I0114 01:49:49.817747   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966587-24120", Name:"deployment-with-unixuserid", UID:"67b26529-f833-44c4-af81-8986128ebdd1", APIVersion:"apps/v1", ResourceVersion:"2078", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set deployment-with-unixuserid-8fcdfc94f to 1
I0114 01:49:49.820862   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"deployment-with-unixuserid-8fcdfc94f", UID:"d71239f5-b9ae-4781-94ab-1b3ec23cd2ee", APIVersion:"apps/v1", ResourceVersion:"2079", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deployment-with-unixuserid-8fcdfc94f-7lkrw
apps.sh:228: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: deployment-with-unixuserid:
deployment.apps "deployment-with-unixuserid" deleted
apps.sh:235: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx-deployment created
I0114 01:49:50.333025   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment", UID:"61999592-7c25-4abb-95d5-d6e467152c99", APIVersion:"apps/v1", ResourceVersion:"2092", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
E0114 01:49:50.334374   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 01:49:50.336220   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment-6986c7bc94", UID:"ebfba31a-00c1-4496-a836-49269e41ec8e", APIVersion:"apps/v1", ResourceVersion:"2093", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-spgzr
I0114 01:49:50.341152   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment-6986c7bc94", UID:"ebfba31a-00c1-4496-a836-49269e41ec8e", APIVersion:"apps/v1", ResourceVersion:"2093", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-fc5xz
I0114 01:49:50.341222   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment-6986c7bc94", UID:"ebfba31a-00c1-4496-a836-49269e41ec8e", APIVersion:"apps/v1", ResourceVersion:"2093", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-hpdr7
E0114 01:49:50.439892   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:239: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 3
deployment.apps "nginx-deployment" deleted
E0114 01:49:50.547546   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:242: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:49:50.660803   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:246: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:247: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx-deployment created
I0114 01:49:50.912472   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment", UID:"ce2199cb-3af0-4742-9881-8b2201e1f4e4", APIVersion:"apps/v1", ResourceVersion:"2114", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-7f6fc565b9 to 1
I0114 01:49:50.915733   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment-7f6fc565b9", UID:"43283c09-8a3b-40ff-8b06-a4a15a854613", APIVersion:"apps/v1", ResourceVersion:"2115", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-7f6fc565b9-84hgx
apps.sh:251: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 1
deployment.apps "nginx-deployment" deleted
apps.sh:256: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:257: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 1
E0114 01:49:51.335484   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:51.441078   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps "nginx-deployment-7f6fc565b9" deleted
E0114 01:49:51.548686   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:49:51.661972   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I0114 01:49:51.770798   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment", UID:"c1255431-b8ac-4eb8-91a0-d43c2007a359", APIVersion:"apps/v1", ResourceVersion:"2134", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I0114 01:49:51.776510   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment-6986c7bc94", UID:"532a15b5-b884-47ed-b2ba-938c506c758e", APIVersion:"apps/v1", ResourceVersion:"2135", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-x855j
I0114 01:49:51.784225   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment-6986c7bc94", UID:"532a15b5-b884-47ed-b2ba-938c506c758e", APIVersion:"apps/v1", ResourceVersion:"2135", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-xzd9k
I0114 01:49:51.784278   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment-6986c7bc94", UID:"532a15b5-b884-47ed-b2ba-938c506c758e", APIVersion:"apps/v1", ResourceVersion:"2135", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-jmqcw
apps.sh:268: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
horizontalpodautoscaler.autoscaling/nginx-deployment autoscaled
apps.sh:271: Successful get hpa nginx-deployment {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
horizontalpodautoscaler.autoscaling "nginx-deployment" deleted
deployment.apps "nginx-deployment" deleted
E0114 01:49:52.336700   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:279: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:49:52.442248   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx created
I0114 01:49:52.538062   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966587-24120", Name:"nginx", UID:"42ffc995-4520-44fd-b0fa-5d35d2a36110", APIVersion:"apps/v1", ResourceVersion:"2158", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-f87d999f7 to 3
I0114 01:49:52.541554   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-f87d999f7", UID:"0f1a64e2-268c-4bbb-9f57-9409dada3e0e", APIVersion:"apps/v1", ResourceVersion:"2159", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-kp85k
I0114 01:49:52.544943   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-f87d999f7", UID:"0f1a64e2-268c-4bbb-9f57-9409dada3e0e", APIVersion:"apps/v1", ResourceVersion:"2159", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-7x2vv
I0114 01:49:52.545020   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-f87d999f7", UID:"0f1a64e2-268c-4bbb-9f57-9409dada3e0e", APIVersion:"apps/v1", ResourceVersion:"2159", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-dqcjb
E0114 01:49:52.549752   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:283: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx:
E0114 01:49:52.663430   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:284: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
deployment.apps/nginx skipped rollback (current template already matches revision 1)
apps.sh:287: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
deployment.apps/nginx configured
I0114 01:49:53.124790   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966587-24120", Name:"nginx", UID:"42ffc995-4520-44fd-b0fa-5d35d2a36110", APIVersion:"apps/v1", ResourceVersion:"2174", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-78487f9fd7 to 1
I0114 01:49:53.129969   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-78487f9fd7", UID:"1bec8177-f7fa-48ff-a280-707204f89d82", APIVersion:"apps/v1", ResourceVersion:"2175", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-78487f9fd7-klhbm
apps.sh:290: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
E0114 01:49:53.337939   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
    Image:	k8s.gcr.io/nginx:test-cmd
E0114 01:49:53.443421   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:293: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
E0114 01:49:53.550945   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx rolled back
E0114 01:49:53.664649   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:54.339302   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:54.444746   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:54.552207   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:297: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0114 01:49:54.665784   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: unable to find specified revision 1000000 in history
apps.sh:300: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
deployment.apps/nginx rolled back
E0114 01:49:55.340453   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:55.445956   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:55.553484   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:55.667159   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:304: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
deployment.apps/nginx paused
error: you cannot rollback a paused deployment; resume it first with 'kubectl rollout resume deployment/nginx' and try again
E0114 01:49:56.341744   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: deployments.apps "nginx" can't restart paused deployment (run rollout resume first)
E0114 01:49:56.447106   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx resumed
E0114 01:49:56.554879   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx rolled back
E0114 01:49:56.668411   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
    deployment.kubernetes.io/revision-history: 1,3
error: desired revision (3) is different from the running revision (5)
deployment.apps/nginx restarted
I0114 01:49:57.125780   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966587-24120", Name:"nginx", UID:"42ffc995-4520-44fd-b0fa-5d35d2a36110", APIVersion:"apps/v1", ResourceVersion:"2206", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-f87d999f7 to 2
I0114 01:49:57.131246   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966587-24120", Name:"nginx", UID:"42ffc995-4520-44fd-b0fa-5d35d2a36110", APIVersion:"apps/v1", ResourceVersion:"2209", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-5ff4697db8 to 1
I0114 01:49:57.133714   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-f87d999f7", UID:"0f1a64e2-268c-4bbb-9f57-9409dada3e0e", APIVersion:"apps/v1", ResourceVersion:"2210", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-f87d999f7-dqcjb
I0114 01:49:57.141637   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-5ff4697db8", UID:"574c3235-26cc-4523-863c-dae3389c4b2c", APIVersion:"apps/v1", ResourceVersion:"2213", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-5ff4697db8-brzms
E0114 01:49:57.343155   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:57.448294   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:57.556180   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:57.669800   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:apiVersion: apps/v1
kind: ReplicaSet
metadata:
  annotations:
    deployment.kubernetes.io/desired-replicas: "3"
... skipping 48 lines ...
      terminationGracePeriodSeconds: 30
status:
  fullyLabeledReplicas: 1
  observedGeneration: 2
  replicas: 1
has:deployment.kubernetes.io/revision: "6"
E0114 01:49:58.344285   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:49:58.449739   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx2 created
I0114 01:49:58.486984   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966587-24120", Name:"nginx2", UID:"f135d8e8-048f-4d4d-9916-79a2ad695afc", APIVersion:"apps/v1", ResourceVersion:"2226", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx2-57b7865cd9 to 3
I0114 01:49:58.491877   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx2-57b7865cd9", UID:"a1562937-755a-46ed-b04e-10317f9716b4", APIVersion:"apps/v1", ResourceVersion:"2227", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-2vprc
I0114 01:49:58.494426   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx2-57b7865cd9", UID:"a1562937-755a-46ed-b04e-10317f9716b4", APIVersion:"apps/v1", ResourceVersion:"2227", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-cc9fv
I0114 01:49:58.496391   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx2-57b7865cd9", UID:"a1562937-755a-46ed-b04e-10317f9716b4", APIVersion:"apps/v1", ResourceVersion:"2227", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-6nc7z
E0114 01:49:58.557391   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx2" deleted
E0114 01:49:58.671264   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx" deleted
apps.sh:334: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx-deployment created
I0114 01:49:58.984421   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment", UID:"27f0068e-82eb-4c53-b96e-28e0371894c3", APIVersion:"apps/v1", ResourceVersion:"2263", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-598d4d68b4 to 3
I0114 01:49:58.987584   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment-598d4d68b4", UID:"b7c9423e-cf14-4081-a8c3-e41c22472ab3", APIVersion:"apps/v1", ResourceVersion:"2265", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-5rxbd
I0114 01:49:58.990142   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment-598d4d68b4", UID:"b7c9423e-cf14-4081-a8c3-e41c22472ab3", APIVersion:"apps/v1", ResourceVersion:"2265", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-t6xq8
I0114 01:49:58.992740   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment-598d4d68b4", UID:"b7c9423e-cf14-4081-a8c3-e41c22472ab3", APIVersion:"apps/v1", ResourceVersion:"2265", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-bng92
apps.sh:337: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
apps.sh:338: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:339: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
E0114 01:49:59.345560   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment image updated
I0114 01:49:59.389618   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment", UID:"27f0068e-82eb-4c53-b96e-28e0371894c3", APIVersion:"apps/v1", ResourceVersion:"2279", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-59df9b5f5b to 1
I0114 01:49:59.395403   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment-59df9b5f5b", UID:"dd3897a8-6560-465a-a25b-8c5093fd3f22", APIVersion:"apps/v1", ResourceVersion:"2280", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-59df9b5f5b-zzwcv
E0114 01:49:59.451083   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 01:49:59.460834   54459 horizontal.go:353] Horizontal Pod Autoscaler frontend has been deleted in namespace-1578966574-26266
apps.sh:342: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
E0114 01:49:59.558817   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:343: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
E0114 01:49:59.672539   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: unable to find container named "redis"
deployment.apps/nginx-deployment image updated
apps.sh:348: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:349: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment image updated
apps.sh:352: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
apps.sh:353: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
E0114 01:50:00.347081   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:00.452275   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:356: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
E0114 01:50:00.560066   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:357: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
E0114 01:50:00.673654   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment image updated
I0114 01:50:00.696214   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment", UID:"27f0068e-82eb-4c53-b96e-28e0371894c3", APIVersion:"apps/v1", ResourceVersion:"2297", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 2
I0114 01:50:00.702967   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment-598d4d68b4", UID:"b7c9423e-cf14-4081-a8c3-e41c22472ab3", APIVersion:"apps/v1", ResourceVersion:"2301", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-5rxbd
I0114 01:50:00.704830   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment", UID:"27f0068e-82eb-4c53-b96e-28e0371894c3", APIVersion:"apps/v1", ResourceVersion:"2299", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-7d758dbc54 to 1
I0114 01:50:00.709603   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment-7d758dbc54", UID:"5963a56e-388f-4ca4-8dc2-8c65f9708123", APIVersion:"apps/v1", ResourceVersion:"2305", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-7d758dbc54-qm4xs
apps.sh:360: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:361: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:364: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:365: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
deployment.apps "nginx-deployment" deleted
E0114 01:50:01.348211   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:371: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:50:01.453398   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:01.561327   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I0114 01:50:01.591336   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment", UID:"500805a2-8a02-4319-8470-2b12409b9479", APIVersion:"apps/v1", ResourceVersion:"2332", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-598d4d68b4 to 3
I0114 01:50:01.606729   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment-598d4d68b4", UID:"66735bdd-b183-40fa-9304-3fcb1741fb80", APIVersion:"apps/v1", ResourceVersion:"2333", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-gt79f
I0114 01:50:01.626515   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment-598d4d68b4", UID:"66735bdd-b183-40fa-9304-3fcb1741fb80", APIVersion:"apps/v1", ResourceVersion:"2333", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-jjvqt
I0114 01:50:01.626958   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment-598d4d68b4", UID:"66735bdd-b183-40fa-9304-3fcb1741fb80", APIVersion:"apps/v1", ResourceVersion:"2333", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-9cfhh
E0114 01:50:01.674911   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap/test-set-env-config created
secret/test-set-env-secret created
apps.sh:376: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
apps.sh:378: Successful get configmaps/test-set-env-config {{.metadata.name}}: test-set-env-config
apps.sh:379: Successful get secret {{range.items}}{{.metadata.name}}:{{end}}: test-set-env-secret:
E0114 01:50:02.349399   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment env updated
I0114 01:50:02.422973   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment", UID:"500805a2-8a02-4319-8470-2b12409b9479", APIVersion:"apps/v1", ResourceVersion:"2348", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6b9f7756b4 to 1
I0114 01:50:02.430002   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment-6b9f7756b4", UID:"052084e8-6052-49d7-a043-9db250e5e982", APIVersion:"apps/v1", ResourceVersion:"2349", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6b9f7756b4-bqvq6
E0114 01:50:02.454185   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:383: Successful get deploy nginx-deployment {{ (index (index .spec.template.spec.containers 0).env 0).name}}: KEY_2
E0114 01:50:02.562931   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:385: Successful get deploy nginx-deployment {{ len (index .spec.template.spec.containers 0).env }}: 1
E0114 01:50:02.676071   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment env updated
I0114 01:50:02.742613   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment", UID:"500805a2-8a02-4319-8470-2b12409b9479", APIVersion:"apps/v1", ResourceVersion:"2358", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 2
I0114 01:50:02.749499   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment-598d4d68b4", UID:"66735bdd-b183-40fa-9304-3fcb1741fb80", APIVersion:"apps/v1", ResourceVersion:"2361", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-gt79f
I0114 01:50:02.754221   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment", UID:"500805a2-8a02-4319-8470-2b12409b9479", APIVersion:"apps/v1", ResourceVersion:"2360", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-754bf964c8 to 1
I0114 01:50:02.757802   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment-754bf964c8", UID:"6f915d14-81f9-4055-8a13-a08ccb738767", APIVersion:"apps/v1", ResourceVersion:"2368", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-754bf964c8-ds2zh
apps.sh:389: Successful get deploy nginx-deployment {{ len (index .spec.template.spec.containers 0).env }}: 2
... skipping 7 lines ...
I0114 01:50:03.087345   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment-598d4d68b4", UID:"66735bdd-b183-40fa-9304-3fcb1741fb80", APIVersion:"apps/v1", ResourceVersion:"2404", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-9cfhh
I0114 01:50:03.089472   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment", UID:"500805a2-8a02-4319-8470-2b12409b9479", APIVersion:"apps/v1", ResourceVersion:"2403", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-5958f7687 to 1
I0114 01:50:03.128261   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment-5958f7687", UID:"04988a4c-aea4-4d73-89d1-70884e2e6d06", APIVersion:"apps/v1", ResourceVersion:"2408", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-5958f7687-wqrm2
deployment.apps/nginx-deployment env updated
I0114 01:50:03.224096   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment", UID:"500805a2-8a02-4319-8470-2b12409b9479", APIVersion:"apps/v1", ResourceVersion:"2417", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-6b9f7756b4 to 0
deployment.apps/nginx-deployment env updated
E0114 01:50:03.350730   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 01:50:03.380623   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment-6b9f7756b4", UID:"052084e8-6052-49d7-a043-9db250e5e982", APIVersion:"apps/v1", ResourceVersion:"2420", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-6b9f7756b4-bqvq6
deployment.apps/nginx-deployment env updated
I0114 01:50:03.423693   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment", UID:"500805a2-8a02-4319-8470-2b12409b9479", APIVersion:"apps/v1", ResourceVersion:"2423", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-d74969475 to 1
E0114 01:50:03.455540   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-deployment" deleted
E0114 01:50:03.564220   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 01:50:03.576823   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966587-24120", Name:"nginx-deployment-d74969475", UID:"349166f5-3bec-49f8-ad19-ed31a1c82d5e", APIVersion:"apps/v1", ResourceVersion:"2429", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-d74969475-kjjx4
configmap "test-set-env-config" deleted
E0114 01:50:03.676914   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret "test-set-env-secret" deleted
+++ exit code: 0
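The image and env updates recorded above are output of kubectl set subcommands against deployment/nginx-deployment. A minimal sketch of that flow; the container names, the missing "redis" container, the KEY_2 key, and the ConfigMap/Secret names come from the log, while the exact flags used by apps.sh are assumed:

  # image updates; the first targets a container that does not exist, producing the
  # "unable to find container named \"redis\"" error seen above
  kubectl set image deployment/nginx-deployment redis=redis
  kubectl set image deployment/nginx-deployment nginx=k8s.gcr.io/nginx:test-cmd
  kubectl set image deployment/nginx-deployment "*=k8s.gcr.io/nginx:test-cmd"      # "*" updates every container
  # env updates; values can come from literals, a ConfigMap, or a Secret, and "KEY-" removes a key
  kubectl set env deployment/nginx-deployment KEY_2=value
  kubectl set env deployment/nginx-deployment --from=configmap/test-set-env-config
  kubectl set env deployment/nginx-deployment --from=secret/test-set-env-secret
  kubectl set env deployment/nginx-deployment KEY_2-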
Recording: run_rs_tests
Running command: run_rs_tests

+++ Running case: test-cmd.run_rs_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
E0114 01:50:03.775680   54459 replica_set.go:534] sync "namespace-1578966587-24120/nginx-deployment-6b9f7756b4" failed with replicasets.apps "nginx-deployment-6b9f7756b4" not found
+++ command: run_rs_tests
+++ [0114 01:50:03] Creating namespace namespace-1578966603-1693
E0114 01:50:03.825566   54459 replica_set.go:534] sync "namespace-1578966587-24120/nginx-deployment-d74969475" failed with replicasets.apps "nginx-deployment-d74969475" not found
namespace/namespace-1578966603-1693 created
Context "test" modified.
+++ [0114 01:50:03] Testing kubectl(v1:replicasets)
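The ReplicaSet assertions that follow are driven by ordinary kubectl verbs. A rough sketch of the sequence; the resource names and the HPA numbers match the log, while the manifest path, image, env key, and resource limits are hypothetical stand-ins for what apps.sh actually passes:

  kubectl create -f frontend-replicaset.yaml       # hypothetical manifest; creates replicaset.apps/frontend with 3 php-redis pods
  kubectl describe rs frontend                     # the Name/Selector/Pod Template/Events dumps below
  kubectl scale rs frontend --replicas=2           # "replicaset.apps/frontend scaled"
  kubectl expose rs frontend --port=80             # "service/frontend exposed"
  kubectl set image rs/frontend "*=k8s.gcr.io/nginx:test-cmd"       # each set bumps .metadata.generation
  kubectl set env rs/frontend KEY=value
  kubectl set resources rs/frontend --limits=cpu=200m,memory=512Mi
  kubectl autoscale rs frontend --min=1 --max=2 --cpu-percent=70    # matches the "1 2 70" HPA check; omitting --max
                                                                    # triggers the "required flag(s) \"max\" not set" error further down
  kubectl delete rs frontend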
apps.sh:511: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
replicaset.apps/frontend created
I0114 01:50:04.219434   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"frontend", UID:"ccf03871-a53d-4c08-90fa-d7530a33b248", APIVersion:"apps/v1", ResourceVersion:"2456", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-8wbt5
I0114 01:50:04.223509   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"frontend", UID:"ccf03871-a53d-4c08-90fa-d7530a33b248", APIVersion:"apps/v1", ResourceVersion:"2456", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-jgz8h
I0114 01:50:04.223548   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"frontend", UID:"ccf03871-a53d-4c08-90fa-d7530a33b248", APIVersion:"apps/v1", ResourceVersion:"2456", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-rxd9t
+++ [0114 01:50:04] Deleting rs
replicaset.apps "frontend" deleted
E0114 01:50:04.327527   54459 replica_set.go:534] sync "namespace-1578966603-1693/frontend" failed with Operation cannot be fulfilled on replicasets.apps "frontend": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1578966603-1693/frontend, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: ccf03871-a53d-4c08-90fa-d7530a33b248, UID in object meta: 
E0114 01:50:04.351944   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:517: Successful get pods -l "tier=frontend" {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:50:04.456890   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:521: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:50:04.565453   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:04.677999   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I0114 01:50:04.696447   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"frontend", UID:"893c81bd-30f2-4f64-bba4-67dc1754181e", APIVersion:"apps/v1", ResourceVersion:"2471", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-f47p7
I0114 01:50:04.698622   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"frontend", UID:"893c81bd-30f2-4f64-bba4-67dc1754181e", APIVersion:"apps/v1", ResourceVersion:"2471", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-sjnmw
I0114 01:50:04.700544   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"frontend", UID:"893c81bd-30f2-4f64-bba4-67dc1754181e", APIVersion:"apps/v1", ResourceVersion:"2471", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-s5n5t
apps.sh:525: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
+++ [0114 01:50:04] Deleting rs
replicaset.apps "frontend" deleted
E0114 01:50:04.976758   54459 replica_set.go:534] sync "namespace-1578966603-1693/frontend" failed with replicasets.apps "frontend" not found
apps.sh:529: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:531: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
(Bpod "frontend-f47p7" deleted
pod "frontend-s5n5t" deleted
pod "frontend-sjnmw" deleted
apps.sh:534: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:50:05.353169   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:538: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:50:05.458140   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:05.566672   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I0114 01:50:05.588064   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"frontend", UID:"430618bd-21d9-4735-a2bc-8e1cfe713fa5", APIVersion:"apps/v1", ResourceVersion:"2494", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-d7wb8
I0114 01:50:05.590869   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"frontend", UID:"430618bd-21d9-4735-a2bc-8e1cfe713fa5", APIVersion:"apps/v1", ResourceVersion:"2494", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-4wpmx
I0114 01:50:05.591740   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"frontend", UID:"430618bd-21d9-4735-a2bc-8e1cfe713fa5", APIVersion:"apps/v1", ResourceVersion:"2494", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-pdwnc
E0114 01:50:05.679227   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:542: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
matched Name:
matched Pod Template:
matched Labels:
matched Selector:
matched Replicas:
... skipping 4 lines ...
Namespace:    namespace-1578966603-1693
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1578966603-1693
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 18 lines ...
Namespace:    namespace-1578966603-1693
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 12 lines ...
Namespace:    namespace-1578966603-1693
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 25 lines ...
Namespace:    namespace-1578966603-1693
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                   Message
  ----    ------            ----  ----                   -------
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-d7wb8
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-4wpmx
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-pdwnc
E0114 01:50:06.354672   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:06.459361   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1578966603-1693
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                   Message
  ----    ------            ----  ----                   -------
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-d7wb8
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-4wpmx
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-pdwnc
E0114 01:50:06.567758   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1578966603-1693
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 3 lines ...
      cpu:     100m
      memory:  100Mi
    Environment:
      GET_HOSTS_FROM:  dns
    Mounts:            <none>
  Volumes:             <none>
E0114 01:50:06.680420   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1578966603-1693
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 107 lines ...
apps.sh:564: Successful get rs frontend {{.spec.replicas}}: 3
I0114 01:50:06.975611   54459 horizontal.go:353] Horizontal Pod Autoscaler nginx-deployment has been deleted in namespace-1578966587-24120
replicaset.apps/frontend scaled
E0114 01:50:07.049463   54459 replica_set.go:199] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1578966603-1693 /apis/apps/v1/namespaces/namespace-1578966603-1693/replicasets/frontend 430618bd-21d9-4735-a2bc-8e1cfe713fa5 2505 2 2020-01-14 01:50:05 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  [{kubectl Update apps/v1 2020-01-14 01:50:05 +0000 UTC FieldsV1 FieldsV1{Raw:*[123 34 102 58 109 101 116 97 100 97 116 97 34 58 123 34 102 58 108 97 98 101 108 115 34 58 123 34 102 58 97 112 112 34 58 123 125 44 34 102 58 116 105 101 114 34 58 123 125 44 34 46 34 58 123 125 125 125 44 34 102 58 115 112 101 99 34 58 123 34 102 58 114 101 112 108 105 99 97 115 34 58 123 125 44 34 102 58 115 101 108 101 99 116 111 114 34 58 123 34 102 58 109 97 116 99 104 76 97 98 101 108 115 34 58 123 34 102 58 97 112 112 34 58 123 125 44 34 102 58 116 105 101 114 34 58 123 125 44 34 46 34 58 123 125 125 125 44 34 102 58 116 101 109 112 108 97 116 101 34 58 123 34 102 58 109 101 116 97 100 97 116 97 34 58 123 34 102 58 108 97 98 101 108 115 34 58 123 34 102 58 97 112 112 34 58 123 125 44 34 102 58 116 105 101 114 34 58 123 125 44 34 46 34 58 123 125 125 125 44 34 102 58 115 112 101 99 34 58 123 34 102 58 99 111 110 116 97 105 110 101 114 115 34 58 123 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 112 104 112 45 114 101 100 105 115 92 34 125 34 58 123 34 102 58 101 110 118 34 58 123 34 107 58 123 92 34 110 97 109 101 92 34 58 92 34 71 69 84 95 72 79 83 84 83 95 70 82 79 77 92 34 125 34 58 123 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 118 97 108 117 101 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 105 109 97 103 101 34 58 123 125 44 34 102 58 105 109 97 103 101 80 117 108 108 80 111 108 105 99 121 34 58 123 125 44 34 102 58 110 97 109 101 34 58 123 125 44 34 102 58 112 111 114 116 115 34 58 123 34 107 58 123 92 34 99 111 110 116 97 105 110 101 114 80 111 114 116 92 34 58 56 48 44 92 34 112 114 111 116 111 99 111 108 92 34 58 92 34 84 67 80 92 34 125 34 58 123 34 102 58 99 111 110 116 97 105 110 101 114 80 111 114 116 34 58 123 125 44 34 102 58 112 114 111 116 111 99 111 108 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 114 101 115 111 117 114 99 101 115 34 58 123 34 102 58 114 101 113 117 101 115 116 115 34 58 123 34 102 58 99 112 117 34 58 123 125 44 34 102 58 109 101 109 111 114 121 34 58 123 125 44 34 46 34 58 123 125 125 44 34 46 34 58 123 125 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 97 116 104 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 77 101 115 115 97 103 101 80 111 108 105 99 121 34 58 123 125 44 34 46 34 58 123 125 125 125 44 34 102 58 100 110 115 80 111 108 105 99 121 34 58 123 125 44 34 102 58 114 101 115 116 97 114 116 80 111 108 105 99 121 34 58 123 125 44 34 102 58 115 99 104 101 100 117 108 101 114 78 97 109 101 34 58 123 125 44 34 102 58 115 101 99 117 114 105 116 121 67 111 110 116 101 120 116 34 58 123 125 44 34 102 58 116 101 114 109 105 110 97 116 105 111 110 71 114 97 99 101 80 101 114 105 111 100 83 101 99 111 110 100 115 34 58 123 125 125 125 125 125],}} {kube-controller-manager Update apps/v1 2020-01-14 01:50:05 +0000 UTC FieldsV1 &FieldsV1{Raw:*[123 34 102 58 115 116 97 116 117 115 34 58 123 34 102 58 102 117 108 108 121 76 97 98 101 108 101 100 82 101 112 108 105 99 97 115 34 58 123 125 44 34 102 58 111 98 115 101 114 118 101 100 71 101 110 101 114 97 116 105 111 110 34 58 123 125 44 34 102 
58 114 101 112 108 105 99 97 115 34 58 123 125 125 125],}}]},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v3 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc003ca7a28 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] []}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:1,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
I0114 01:50:07.054580   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"frontend", UID:"430618bd-21d9-4735-a2bc-8e1cfe713fa5", APIVersion:"apps/v1", ResourceVersion:"2505", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-d7wb8
apps.sh:568: Successful get rs frontend {{.spec.replicas}}: 2
E0114 01:50:07.355930   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/scale-1 created
I0114 01:50:07.402696   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966603-1693", Name:"scale-1", UID:"c0941e96-e11d-4f5c-a57a-6debf370f1dd", APIVersion:"apps/v1", ResourceVersion:"2511", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 1
I0114 01:50:07.407506   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"scale-1-5c5565bcd9", UID:"cd7f27f2-cfa5-4487-92f7-7a5b3095f4df", APIVersion:"apps/v1", ResourceVersion:"2512", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-5c5565bcd9-ppxmr
E0114 01:50:07.460456   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:07.569160   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/scale-2 created
I0114 01:50:07.606815   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966603-1693", Name:"scale-2", UID:"52d949bc-8331-4286-ba23-8faf5cf505ad", APIVersion:"apps/v1", ResourceVersion:"2521", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-5c5565bcd9 to 1
I0114 01:50:07.611498   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"scale-2-5c5565bcd9", UID:"3fc83277-b616-470a-b04c-5e7c06ea5ebb", APIVersion:"apps/v1", ResourceVersion:"2522", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-5c5565bcd9-xz8zw
E0114 01:50:07.681723   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/scale-3 created
I0114 01:50:07.831748   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966603-1693", Name:"scale-3", UID:"8d4295c1-ad3f-442b-90f8-b5c02abcca7b", APIVersion:"apps/v1", ResourceVersion:"2531", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-3-5c5565bcd9 to 1
I0114 01:50:07.865880   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"scale-3-5c5565bcd9", UID:"1d00b3ea-ba15-47b1-9433-f3aaa69a75a9", APIVersion:"apps/v1", ResourceVersion:"2532", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-xhg4t
apps.sh:574: Successful get deploy scale-1 {{.spec.replicas}}: 1
apps.sh:575: Successful get deploy scale-2 {{.spec.replicas}}: 1
apps.sh:576: Successful get deploy scale-3 {{.spec.replicas}}: 1
deployment.apps/scale-1 scaled
I0114 01:50:08.288536   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966603-1693", Name:"scale-1", UID:"c0941e96-e11d-4f5c-a57a-6debf370f1dd", APIVersion:"apps/v1", ResourceVersion:"2541", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 2
deployment.apps/scale-2 scaled
I0114 01:50:08.291252   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"scale-1-5c5565bcd9", UID:"cd7f27f2-cfa5-4487-92f7-7a5b3095f4df", APIVersion:"apps/v1", ResourceVersion:"2542", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-5c5565bcd9-pc74r
I0114 01:50:08.293858   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966603-1693", Name:"scale-2", UID:"52d949bc-8331-4286-ba23-8faf5cf505ad", APIVersion:"apps/v1", ResourceVersion:"2543", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-5c5565bcd9 to 2
I0114 01:50:08.300092   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"scale-2-5c5565bcd9", UID:"3fc83277-b616-470a-b04c-5e7c06ea5ebb", APIVersion:"apps/v1", ResourceVersion:"2547", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-5c5565bcd9-x4drv
E0114 01:50:08.357042   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:579: Successful get deploy scale-1 {{.spec.replicas}}: 2
E0114 01:50:08.461941   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:580: Successful get deploy scale-2 {{.spec.replicas}}: 2
E0114 01:50:08.570586   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:581: Successful get deploy scale-3 {{.spec.replicas}}: 1
E0114 01:50:08.682758   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/scale-1 scaled
I0114 01:50:08.690982   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966603-1693", Name:"scale-1", UID:"c0941e96-e11d-4f5c-a57a-6debf370f1dd", APIVersion:"apps/v1", ResourceVersion:"2561", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 3
deployment.apps/scale-2 scaled
I0114 01:50:08.694769   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"scale-1-5c5565bcd9", UID:"cd7f27f2-cfa5-4487-92f7-7a5b3095f4df", APIVersion:"apps/v1", ResourceVersion:"2562", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-5c5565bcd9-gknq4
I0114 01:50:08.697998   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966603-1693", Name:"scale-2", UID:"52d949bc-8331-4286-ba23-8faf5cf505ad", APIVersion:"apps/v1", ResourceVersion:"2563", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-5c5565bcd9 to 3
deployment.apps/scale-3 scaled
... skipping 5 lines ...
apps.sh:585: Successful get deploy scale-2 {{.spec.replicas}}: 3
apps.sh:586: Successful get deploy scale-3 {{.spec.replicas}}: 3
replicaset.apps "frontend" deleted
deployment.apps "scale-1" deleted
deployment.apps "scale-2" deleted
deployment.apps "scale-3" deleted
E0114 01:50:09.358109   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I0114 01:50:09.389416   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"frontend", UID:"39dcb68a-5800-4dbf-b862-1805ec0d6b66", APIVersion:"apps/v1", ResourceVersion:"2625", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-lcd5t
I0114 01:50:09.392903   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"frontend", UID:"39dcb68a-5800-4dbf-b862-1805ec0d6b66", APIVersion:"apps/v1", ResourceVersion:"2625", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-nl22d
I0114 01:50:09.393133   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"frontend", UID:"39dcb68a-5800-4dbf-b862-1805ec0d6b66", APIVersion:"apps/v1", ResourceVersion:"2625", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-fkxfm
E0114 01:50:09.463253   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:594: Successful get rs frontend {{.spec.replicas}}: 3
E0114 01:50:09.571728   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/frontend exposed
E0114 01:50:09.683842   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:598: Successful get service frontend {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
service/frontend-2 exposed
apps.sh:602: Successful get service frontend-2 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: default 80
(Bservice "frontend" deleted
service "frontend-2" deleted
apps.sh:608: Successful get rs frontend {{.metadata.generation}}: 1
replicaset.apps/frontend image updated
apps.sh:610: Successful get rs frontend {{.metadata.generation}}: 2
E0114 01:50:10.359245   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend env updated
E0114 01:50:10.464430   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:612: Successful get rs frontend {{.metadata.generation}}: 3
E0114 01:50:10.572870   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend resource requirements updated
apps.sh:614: Successful get rs frontend {{.metadata.generation}}: 4
E0114 01:50:10.685016   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:618: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
(Breplicaset.apps "frontend" deleted
apps.sh:622: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:626: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
replicaset.apps/frontend created
I0114 01:50:11.257209   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"frontend", UID:"5eef3a7c-989a-4dcf-8a85-e64f84ff8563", APIVersion:"apps/v1", ResourceVersion:"2661", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-pbcft
I0114 01:50:11.260833   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"frontend", UID:"5eef3a7c-989a-4dcf-8a85-e64f84ff8563", APIVersion:"apps/v1", ResourceVersion:"2661", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-g7vfx
I0114 01:50:11.261037   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"frontend", UID:"5eef3a7c-989a-4dcf-8a85-e64f84ff8563", APIVersion:"apps/v1", ResourceVersion:"2661", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-f4svr
E0114 01:50:11.362347   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/redis-slave created
I0114 01:50:11.454834   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"redis-slave", UID:"cf92c534-255a-4ec1-8455-5c16aab46e6b", APIVersion:"apps/v1", ResourceVersion:"2670", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-xb5t8
I0114 01:50:11.460161   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"redis-slave", UID:"cf92c534-255a-4ec1-8455-5c16aab46e6b", APIVersion:"apps/v1", ResourceVersion:"2670", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-fv2p8
E0114 01:50:11.465357   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:631: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
E0114 01:50:11.574095   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:635: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
E0114 01:50:11.686486   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps "frontend" deleted
replicaset.apps "redis-slave" deleted
apps.sh:639: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:644: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
replicaset.apps/frontend created
I0114 01:50:12.154591   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"frontend", UID:"ba99f22c-01ae-4994-8a7b-2892c4e2b430", APIVersion:"apps/v1", ResourceVersion:"2689", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-kqmq5
I0114 01:50:12.160220   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"frontend", UID:"ba99f22c-01ae-4994-8a7b-2892c4e2b430", APIVersion:"apps/v1", ResourceVersion:"2689", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-vf5zr
I0114 01:50:12.160256   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966603-1693", Name:"frontend", UID:"ba99f22c-01ae-4994-8a7b-2892c4e2b430", APIVersion:"apps/v1", ResourceVersion:"2689", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-58mlj
apps.sh:647: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
horizontalpodautoscaler.autoscaling/frontend autoscaled
E0114 01:50:12.363437   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:650: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
E0114 01:50:12.466767   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling "frontend" deleted
E0114 01:50:12.575292   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling/frontend autoscaled
E0114 01:50:12.687853   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:654: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
(Bhorizontalpodautoscaler.autoscaling "frontend" deleted
Error: required flag(s) "max" not set
replicaset.apps "frontend" deleted
+++ exit code: 0
Recording: run_stateful_set_tests
Running command: run_stateful_set_tests

+++ Running case: test-cmd.run_stateful_set_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_stateful_set_tests
+++ [0114 01:50:13] Creating namespace namespace-1578966613-27365
namespace/namespace-1578966613-27365 created
Context "test" modified.
+++ [0114 01:50:13] Testing kubectl(v1:statefulsets)
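The StatefulSet checks below reduce to create, scale, and rollout restart. A minimal sketch, with the manifest path as a hypothetical stand-in (the log shows spec.replicas starting at 0) and the exact apps.sh flags assumed:

  kubectl create -f nginx-statefulset.yaml          # hypothetical manifest; statefulset.apps/nginx with 0 replicas
  kubectl scale statefulset nginx --replicas=1      # observedGeneration 1 -> 2, pod nginx-0 created
  kubectl rollout restart statefulset nginx         # "statefulset.apps/nginx restarted", observedGeneration 3
  kubectl delete statefulset nginx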
apps.sh:470: Successful get statefulset {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:50:13.364588   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:13.467963   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 01:50:13.511141   50985 controller.go:606] quota admission added evaluator for: statefulsets.apps
statefulset.apps/nginx created
E0114 01:50:13.576489   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:476: Successful get statefulset nginx {{.spec.replicas}}: 0
E0114 01:50:13.689092   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:477: Successful get statefulset nginx {{.status.observedGeneration}}: 1
statefulset.apps/nginx scaled
I0114 01:50:13.814438   54459 event.go:278] Event(v1.ObjectReference{Kind:"StatefulSet", Namespace:"namespace-1578966613-27365", Name:"nginx", UID:"d06f7c6c-5d42-4986-b9f9-5cd74e9b7dc0", APIVersion:"apps/v1", ResourceVersion:"2716", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' create Pod nginx-0 in StatefulSet nginx successful
apps.sh:481: Successful get statefulset nginx {{.spec.replicas}}: 1
apps.sh:482: Successful get statefulset nginx {{.status.observedGeneration}}: 2
statefulset.apps/nginx restarted
apps.sh:490: Successful get statefulset nginx {{.status.observedGeneration}}: 3
E0114 01:50:14.365712   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps "nginx" deleted
I0114 01:50:14.389194   54459 stateful_set.go:420] StatefulSet has been deleted namespace-1578966613-27365/nginx
E0114 01:50:14.469085   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_statefulset_history_tests
Running command: run_statefulset_history_tests

+++ Running case: test-cmd.run_statefulset_history_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
E0114 01:50:14.577766   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ command: run_statefulset_history_tests
+++ [0114 01:50:14] Creating namespace namespace-1578966614-22405
namespace/namespace-1578966614-22405 created
E0114 01:50:14.690228   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0114 01:50:14] Testing kubectl(v1:statefulsets, v1:controllerrevisions)
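This case walks a StatefulSet through two revisions and back, using the ControllerRevisions dumped below. The apply file names and the --record flag appear verbatim in the change-cause annotations; the rollout flags are assumed:

  kubectl apply -f hack/testdata/rollingupdate-statefulset.yaml --record
  kubectl rollout undo statefulset nginx                          # "skipped rollback" while still on revision 1
  kubectl apply -f hack/testdata/rollingupdate-statefulset-rv2.yaml --record
  kubectl rollout undo statefulset nginx --to-revision=1          # back to k8s.gcr.io/nginx-slim:0.7, one container
  kubectl rollout undo statefulset nginx --to-revision=1000000    # "unable to find specified revision"
  kubectl rollout undo statefulset nginx                          # forward again to revision 2 (two containers)
  kubectl delete statefulset nginx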
apps.sh:418: Successful get statefulset {{range.items}}{{.metadata.name}}:{{end}}: 
statefulset.apps/nginx created
apps.sh:422: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1578966614-22405"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.7","name":"nginx","ports":[{"containerPort":80,"name":"web"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
statefulset.apps/nginx skipped rollback (current template already matches revision 1)
apps.sh:425: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
E0114 01:50:15.366988   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:426: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
E0114 01:50:15.470318   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:15.579191   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx configured
E0114 01:50:15.691566   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:429: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
apps.sh:430: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:431: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
apps.sh:432: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1578966614-22405"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.7","name":"nginx","ports":[{"containerPort":80,"name":"web"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1578966614-22405"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.8","name":"nginx","ports":[{"containerPort":80,"name":"web"}]},{"image":"k8s.gcr.io/pause:2.0","name":"pause","ports":[{"containerPort":81,"name":"web-2"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
... skipping 11 lines ...
    Environment:	<none>
    Mounts:	<none>
  Volumes:	<none>
 (dry run)
apps.sh:435: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
apps.sh:436: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
E0114 01:50:16.368121   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:437: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
E0114 01:50:16.471576   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx rolled back
E0114 01:50:16.580154   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:440: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
E0114 01:50:16.692825   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:441: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
Successful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
apps.sh:445: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
apps.sh:446: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
statefulset.apps/nginx rolled back
E0114 01:50:17.369396   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:449: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
E0114 01:50:17.472886   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:450: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
E0114 01:50:17.581565   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:451: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
E0114 01:50:17.694413   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps "nginx" deleted
I0114 01:50:17.718973   54459 stateful_set.go:420] StatefulSet has been deleted namespace-1578966614-22405/nginx
+++ exit code: 0
Recording: run_lists_tests
Running command: run_lists_tests

... skipping 5 lines ...
Context "test" modified.
+++ [0114 01:50:18] Testing kubectl(v1:lists)
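The v1:lists case feeds kubectl a single manifest whose top-level kind is List, bundling the Service and Deployment created below. A sketch of what such a manifest and invocation could look like; the object names match the log, but the spec fields and image are illustrative only:

  kubectl create -f - <<'EOF'
  apiVersion: v1
  kind: List
  items:
  - apiVersion: v1
    kind: Service
    metadata:
      name: list-service-test
    spec:
      ports:
      - port: 80
  - apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: list-deployment-test
      labels:
        app: list-deployment-test
    spec:
      replicas: 1
      selector:
        matchLabels:
          app: list-deployment-test
      template:
        metadata:
          labels:
            app: list-deployment-test
        spec:
          containers:
          - name: list-deployment-test
            image: k8s.gcr.io/pause:2.0
  EOF
  kubectl delete service/list-service-test deployment/list-deployment-test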
service/list-service-test created
deployment.apps/list-deployment-test created
I0114 01:50:18.339489   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966617-30462", Name:"list-deployment-test", UID:"4803ac53-72fa-4498-98b8-11c64f39db95", APIVersion:"apps/v1", ResourceVersion:"2754", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set list-deployment-test-7cd8c5ff6d to 1
I0114 01:50:18.345882   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966617-30462", Name:"list-deployment-test-7cd8c5ff6d", UID:"0ce95af6-7deb-410e-b385-096963a7e7d8", APIVersion:"apps/v1", ResourceVersion:"2755", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: list-deployment-test-7cd8c5ff6d-ctwnz
E0114 01:50:18.370485   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "list-service-test" deleted
deployment.apps "list-deployment-test" deleted
+++ exit code: 0
E0114 01:50:18.474139   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_multi_resources_tests
Running command: run_multi_resources_tests

+++ Running case: test-cmd.run_multi_resources_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_multi_resources_tests
+++ [0114 01:50:18] Creating namespace namespace-1578966618-6756
E0114 01:50:18.583028   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578966618-6756 created
E0114 01:50:18.695639   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0114 01:50:18] Testing kubectl(v1:multiple resources)
Testing with file hack/testdata/multi-resource-yaml.yaml and replace with file hack/testdata/multi-resource-yaml-modify.yaml
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
service/mock created
replicationcontroller/mock created
I0114 01:50:19.132740   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966618-6756", Name:"mock", UID:"97076401-763c-4315-8522-64ebd6ecfc4a", APIVersion:"v1", ResourceVersion:"2778", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-t2vzd
generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
E0114 01:50:19.371761   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
service/mock   ClusterIP   10.0.0.165   <none>        99/TCP    0s

NAME                         DESIRED   CURRENT   READY   AGE
replicationcontroller/mock   1         1         0       0s
E0114 01:50:19.475449   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:19.584685   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Name:              mock
Namespace:         namespace-1578966618-6756
Labels:            app=mock
Annotations:       <none>
Selector:          app=mock
Type:              ClusterIP
... skipping 8 lines ...
Name:         mock
Namespace:    namespace-1578966618-6756
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 2 lines ...
    Mounts:       <none>
  Volumes:        <none>
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: mock-t2vzd
E0114 01:50:19.696846   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
service/mock replaced
replicationcontroller/mock replaced
I0114 01:50:19.868363   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966618-6756", Name:"mock", UID:"f7aa6bdc-872d-48c7-8c99-dec633269c27", APIVersion:"v1", ResourceVersion:"2793", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-xcrcb
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
service/mock edited
replicationcontroller/mock edited
E0114 01:50:20.372924   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
E0114 01:50:20.476779   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
E0114 01:50:20.586081   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock labeled
replicationcontroller/mock labeled
E0114 01:50:20.698152   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
service/mock annotated
replicationcontroller/mock annotated
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
service "mock" deleted
replicationcontroller "mock" deleted
Testing with file hack/testdata/multi-resource-list.json and replace with file hack/testdata/multi-resource-list-modify.json
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:50:21.374194   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:50:21.478022   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:21.587646   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
replicationcontroller/mock created
I0114 01:50:21.663002   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966618-6756", Name:"mock", UID:"06b72021-42ad-465c-963b-e274f47e471d", APIVersion:"v1", ResourceVersion:"2818", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-ddvd4
E0114 01:50:21.699569   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
service/mock   ClusterIP   10.0.0.50    <none>        99/TCP    0s

NAME                         DESIRED   CURRENT   READY   AGE
... skipping 15 lines ...
Name:         mock
Namespace:    namespace-1578966618-6756
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 2 lines ...
    Mounts:       <none>
  Volumes:        <none>
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: mock-ddvd4
E0114 01:50:22.375350   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
service/mock replaced
replicationcontroller/mock replaced
I0114 01:50:22.414122   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966618-6756", Name:"mock", UID:"34bd7efc-08b0-4011-9ce2-beaae2225f43", APIVersion:"v1", ResourceVersion:"2832", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-b89qw
E0114 01:50:22.479357   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
E0114 01:50:22.588946   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
E0114 01:50:22.700705   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock edited
replicationcontroller/mock edited
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
service/mock labeled
replicationcontroller/mock labeled
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
E0114 01:50:23.376643   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock annotated
replicationcontroller/mock annotated
E0114 01:50:23.480707   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
E0114 01:50:23.590486   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
E0114 01:50:23.702120   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
Testing with file hack/testdata/multi-resource-json.json and replace with file hack/testdata/multi-resource-json-modify.json
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
service/mock created
replicationcontroller/mock created
I0114 01:50:24.128881   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966618-6756", Name:"mock", UID:"1007bb53-80c9-46cd-8d8b-7a9a2457efc0", APIVersion:"v1", ResourceVersion:"2856", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-m8bfw
generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
E0114 01:50:24.377833   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
service/mock   ClusterIP   10.0.0.141   <none>        99/TCP    0s

NAME                         DESIRED   CURRENT   READY   AGE
replicationcontroller/mock   1         1         0       0s
E0114 01:50:24.481921   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:24.591648   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Name:              mock
Namespace:         namespace-1578966618-6756
Labels:            app=mock
Annotations:       <none>
Selector:          app=mock
Type:              ClusterIP
... skipping 8 lines ...
Name:         mock
Namespace:    namespace-1578966618-6756
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 2 lines ...
    Mounts:       <none>
  Volumes:        <none>
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: mock-m8bfw
E0114 01:50:24.703296   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
service/mock replaced
replicationcontroller/mock replaced
I0114 01:50:24.828489   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966618-6756", Name:"mock", UID:"1dcfc394-e35b-43f2-9684-2c90e98c86e8", APIVersion:"v1", ResourceVersion:"2871", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-stzcn
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
service/mock edited
replicationcontroller/mock edited
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
E0114 01:50:25.379117   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
E0114 01:50:25.483419   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock labeled
replicationcontroller/mock labeled
E0114 01:50:25.593163   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
E0114 01:50:25.704356   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
service/mock annotated
replicationcontroller/mock annotated
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
service "mock" deleted
replicationcontroller "mock" deleted
Testing with file hack/testdata/multi-resource-rclist.json and replace with file hack/testdata/multi-resource-rclist-modify.json
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:50:26.380562   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:26.484627   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/mock created
I0114 01:50:26.526567   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966618-6756", Name:"mock", UID:"80d711a9-a4a2-43c3-b6c4-204735c19ab7", APIVersion:"v1", ResourceVersion:"2892", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-thr6r
replicationcontroller/mock2 created
I0114 01:50:26.531890   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966618-6756", Name:"mock2", UID:"130ff097-c9bf-4bbf-81c6-b809c8c01e75", APIVersion:"v1", ResourceVersion:"2894", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock2-kxjzc
E0114 01:50:26.594569   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:78: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:mock2:
E0114 01:50:26.705507   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAME    DESIRED   CURRENT   READY   AGE
mock    1         1         0       0s
mock2   1         1         0       0s
Name:         mock
Namespace:    namespace-1578966618-6756
Selector:     app=mock
Labels:       app=mock
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 11 lines ...
Namespace:    namespace-1578966618-6756
Selector:     app=mock2
Labels:       app=mock2
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock2
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 10 lines ...
replicationcontroller/mock replaced
I0114 01:50:27.183226   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966618-6756", Name:"mock", UID:"86ab3c38-9dbf-4a3c-9a8b-c2f4b4376244", APIVersion:"v1", ResourceVersion:"2910", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-4zx2k
replicationcontroller/mock2 replaced
I0114 01:50:27.188599   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966618-6756", Name:"mock2", UID:"8fd9408e-c50f-486a-a9a3-51c69ad84986", APIVersion:"v1", ResourceVersion:"2912", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock2-fq2l5
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
I0114 01:50:27.345147   54459 horizontal.go:353] Horizontal Pod Autoscaler frontend has been deleted in namespace-1578966603-1693
E0114 01:50:27.381794   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:104: Successful get rc mock2 {{.metadata.labels.status}}: replaced
E0114 01:50:27.485935   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:27.595897   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/mock edited
replicationcontroller/mock2 edited
E0114 01:50:27.706910   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
generic-resources.sh:122: Successful get rc mock2 {{.metadata.labels.status}}: edited
replicationcontroller/mock labeled
replicationcontroller/mock2 labeled
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
generic-resources.sh:142: Successful get rc mock2 {{.metadata.labels.labeled}}: true
replicationcontroller/mock annotated
replicationcontroller/mock2 annotated
E0114 01:50:28.383152   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
E0114 01:50:28.487210   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:161: Successful get rc mock2 {{.metadata.annotations.annotated}}: true
replicationcontroller "mock" deleted
E0114 01:50:28.597052   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller "mock2" deleted
Testing with file hack/testdata/multi-resource-svclist.json and replace with file hack/testdata/multi-resource-svclist-modify.json
E0114 01:50:28.707989   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
service/mock created
service/mock2 created
generic-resources.sh:70: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:mock2:
NAME    TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
... skipping 22 lines ...
IP:                10.0.0.245
Port:              <unset>  99/TCP
TargetPort:        9949/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
E0114 01:50:29.384198   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:29.488527   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
service "mock2" deleted
service/mock replaced
service/mock2 replaced
E0114 01:50:29.597913   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
E0114 01:50:29.709183   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:98: Successful get services mock2 {{.metadata.labels.status}}: replaced
service/mock edited
service/mock2 edited
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
generic-resources.sh:116: Successful get services mock2 {{.metadata.labels.status}}: edited
service/mock labeled
service/mock2 labeled
E0114 01:50:30.385453   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
E0114 01:50:30.489928   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:136: Successful get services mock2 {{.metadata.labels.labeled}}: true
E0114 01:50:30.599793   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock annotated
service/mock2 annotated
E0114 01:50:30.710346   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
generic-resources.sh:155: Successful get services mock2 {{.metadata.annotations.annotated}}: true
service "mock" deleted
service "mock2" deleted
generic-resources.sh:173: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:174: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:50:31.386963   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:31.491325   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:31.601033   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
replicationcontroller/mock created
I0114 01:50:31.642787   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966618-6756", Name:"mock", UID:"adddeb9b-040b-4a17-afac-9d540e00fe18", APIVersion:"v1", ResourceVersion:"2974", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-p9txr
E0114 01:50:31.711498   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:180: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
generic-resources.sh:181: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
service "mock" deleted
replicationcontroller "mock" deleted
generic-resources.sh:187: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:188: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 2 lines ...
Running command: run_persistent_volumes_tests

+++ Running case: test-cmd.run_persistent_volumes_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_persistent_volumes_tests
+++ [0114 01:50:32] Creating namespace namespace-1578966632-544
E0114 01:50:32.388057   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578966632-544 created
E0114 01:50:32.492577   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0114 01:50:32] Testing persistent volumes
E0114 01:50:32.602395   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:30: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:50:32.712672   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume/pv0001 created
storage.sh:33: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
(Bpersistentvolume "pv0001" deleted
persistentvolume/pv0002 created
storage.sh:36: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0002:
E0114 01:50:33.389117   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume "pv0002" deleted
E0114 01:50:33.493664   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume/pv0003 created
E0114 01:50:33.603422   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:39: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0003:
E0114 01:50:33.713775   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume "pv0003" deleted
storage.sh:42: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
persistentvolume/pv0001 created
storage.sh:45: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
... skipping 2 lines ...
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
persistentvolume "pv0001" deleted
has:persistentvolume "pv0001" deleted
storage.sh:49: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
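The persistent-volume checks above only create, list, and delete PVs named pv0001 through pv0003; a minimal sketch of one such object, assuming a hostPath-backed volume (only the name pv0001 comes from the log, every spec field is illustrative):
kubectl create -f - <<'EOF'
apiVersion: v1
kind: PersistentVolume
metadata:
  name: pv0001
spec:
  capacity:
    storage: 1Gi
  accessModes:
    - ReadWriteOnce
  hostPath:
    path: /tmp/pv0001
EOF
kubectl get pv -o go-template='{{range .items}}{{.metadata.name}}:{{end}}'
kubectl delete pv pv0001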
E0114 01:50:34.390367   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_persistent_volume_claims_tests
Running command: run_persistent_volume_claims_tests

+++ Running case: test-cmd.run_persistent_volume_claims_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_persistent_volume_claims_tests
+++ [0114 01:50:34] Creating namespace namespace-1578966634-2806
E0114 01:50:34.494954   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578966634-2806 created
E0114 01:50:34.604640   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0114 01:50:34] Testing persistent volumes claims
E0114 01:50:34.715002   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:64: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: 
persistentvolumeclaim/myclaim-1 created
I0114 01:50:34.906235   54459 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578966634-2806", Name:"myclaim-1", UID:"b7135060-2274-4a4c-9677-2df94d9f6596", APIVersion:"v1", ResourceVersion:"3011", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I0114 01:50:34.910093   54459 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578966634-2806", Name:"myclaim-1", UID:"b7135060-2274-4a4c-9677-2df94d9f6596", APIVersion:"v1", ResourceVersion:"3013", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
storage.sh:67: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-1:
(Bpersistentvolumeclaim "myclaim-1" deleted
I0114 01:50:35.101926   54459 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578966634-2806", Name:"myclaim-1", UID:"b7135060-2274-4a4c-9677-2df94d9f6596", APIVersion:"v1", ResourceVersion:"3015", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
persistentvolumeclaim/myclaim-2 created
I0114 01:50:35.294085   54459 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578966634-2806", Name:"myclaim-2", UID:"3fef16d8-5179-4f47-90b9-765744f14a66", APIVersion:"v1", ResourceVersion:"3020", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I0114 01:50:35.297358   54459 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578966634-2806", Name:"myclaim-2", UID:"3fef16d8-5179-4f47-90b9-765744f14a66", APIVersion:"v1", ResourceVersion:"3022", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E0114 01:50:35.391397   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:71: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-2:
(Bpersistentvolumeclaim "myclaim-2" deleted
I0114 01:50:35.481298   54459 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578966634-2806", Name:"myclaim-2", UID:"3fef16d8-5179-4f47-90b9-765744f14a66", APIVersion:"v1", ResourceVersion:"3024", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E0114 01:50:35.495970   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:35.606372   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolumeclaim/myclaim-3 created
I0114 01:50:35.676491   54459 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578966634-2806", Name:"myclaim-3", UID:"cabe82f5-7a23-49cc-a548-8a50a08f093e", APIVersion:"v1", ResourceVersion:"3027", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I0114 01:50:35.679651   54459 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578966634-2806", Name:"myclaim-3", UID:"cabe82f5-7a23-49cc-a548-8a50a08f093e", APIVersion:"v1", ResourceVersion:"3029", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E0114 01:50:35.716201   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:75: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-3:
(Bpersistentvolumeclaim "myclaim-3" deleted
I0114 01:50:35.866034   54459 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1578966634-2806", Name:"myclaim-3", UID:"cabe82f5-7a23-49cc-a548-8a50a08f093e", APIVersion:"v1", ResourceVersion:"3031", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
storage.sh:78: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
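The claims above stay Pending and emit FailedBinding events because, as the events say, no persistent volume matches and no storage class is set; a minimal sketch of such a claim (only the name myclaim-1 comes from the log, the spec is illustrative):
kubectl create -f - <<'EOF'
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: myclaim-1
spec:
  accessModes:
    - ReadWriteOnce
  resources:
    requests:
      storage: 1Gi
EOF
kubectl get pvc
kubectl delete pvc myclaim-1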
Recording: run_storage_class_tests
... skipping 2 lines ...
+++ Running case: test-cmd.run_storage_class_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_storage_class_tests
+++ [0114 01:50:36] Testing storage class
storage.sh:92: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: 
storageclass.storage.k8s.io/storage-class-name created
E0114 01:50:36.392679   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:108: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: storage-class-name:
E0114 01:50:36.497241   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:109: Successful get sc {{range.items}}{{.metadata.name}}:{{end}}: storage-class-name:
E0114 01:50:36.607704   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storageclass.storage.k8s.io "storage-class-name" deleted
E0114 01:50:36.717317   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:112: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
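The storage-class check also exercises the sc short name; a minimal sketch of an equivalent object (only the name storage-class-name comes from the log, the no-provisioner choice is an assumption):
kubectl create -f - <<'EOF'
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: storage-class-name
provisioner: kubernetes.io/no-provisioner
EOF
kubectl get storageclass
kubectl get sc                                   # short name, as asserted above
kubectl delete storageclass storage-class-name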
Recording: run_nodes_tests
Running command: run_nodes_tests

+++ Running case: test-cmd.run_nodes_tests 
... skipping 144 lines ...
  Resource           Requests  Limits
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
E0114 01:50:37.393841   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:37.498241   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1383: Successful describe
Name:               127.0.0.1
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Tue, 14 Jan 2020 01:46:10 +0000
... skipping 35 lines ...
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
E0114 01:50:37.608739   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Labels:
matched CreationTimestamp:
matched Conditions:
matched Addresses:
matched Capacity:
... skipping 41 lines ...
  Resource           Requests  Limits
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
E0114 01:50:37.718695   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:               127.0.0.1
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Tue, 14 Jan 2020 01:46:10 +0000
... skipping 128 lines ...
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
core.sh:1395: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
(Bnode/127.0.0.1 patched
core.sh:1398: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: true
E0114 01:50:38.395296   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 patched
E0114 01:50:38.499566   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1401: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0114 01:50:38.609964   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
tokenreview.authentication.k8s.io/<unknown> created
tokenreview.authentication.k8s.io/<unknown> created
E0114 01:50:38.719887   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
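The node checks above flip .spec.unschedulable on node 127.0.0.1 and read it back; a sketch of patch commands consistent with the observed values (the patch payloads are assumptions, not taken from the harness):
kubectl get nodes 127.0.0.1 -o go-template='{{.spec.unschedulable}}'    # <no value> while schedulable
kubectl patch node 127.0.0.1 -p '{"spec":{"unschedulable":true}}'
kubectl get nodes 127.0.0.1 -o go-template='{{.spec.unschedulable}}'    # true
kubectl patch node 127.0.0.1 -p '{"spec":{"unschedulable":null}}'       # clear the field again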
Recording: run_authorization_tests
Running command: run_authorization_tests

+++ Running case: test-cmd.run_authorization_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
... skipping 57 lines ...
Successful
message:yes
has:yes
Successful
message:yes
has:yes
E0114 01:50:39.396589   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:39.500801   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Warning: the server doesn't have a resource type 'invalid_resource'
yes
has:the server doesn't have a resource type
E0114 01:50:39.611260   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:yes
has:yes
E0114 01:50:39.721199   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: --subresource can not be used with NonResourceURL
has:subresource can not be used with NonResourceURL
Successful
Successful
message:yes
0
has:0
... skipping 8 lines ...
yes
has:Warning: the server doesn't have a resource type 'foo'
Successful
message:Warning: the server doesn't have a resource type 'foo'
yes
has not:Warning: resource 'foo' is not namespace scoped
E0114 01:50:40.397903   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:yes
has not:Warning
E0114 01:50:40.502098   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Warning: resource 'nodes' is not namespace scoped
yes
has:Warning: resource 'nodes' is not namespace scoped
E0114 01:50:40.612504   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:yes
has not:Warning: resource 'nodes' is not namespace scoped
E0114 01:50:40.722792   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
clusterrole.rbac.authorization.k8s.io/testing-CR reconciled
	reconciliation required create
	missing rules added:
		{Verbs:[create delete deletecollection get list patch update watch] APIGroups:[] Resources:[pods] ResourceNames:[] NonResourceURLs:[]}
clusterrolebinding.rbac.authorization.k8s.io/testing-CRB reconciled
	reconciliation required create
... skipping 9 lines ...
		{Verbs:[get list watch] APIGroups:[] Resources:[configmaps] ResourceNames:[] NonResourceURLs:[]}
legacy-script.sh:821: Successful get rolebindings -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-RB:
legacy-script.sh:822: Successful get roles -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-R:
legacy-script.sh:823: Successful get clusterrolebindings -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CRB:
legacy-script.sh:824: Successful get clusterroles -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CR:
Successful
message:error: only rbac.authorization.k8s.io/v1 is supported: not *v1beta1.ClusterRole
has:only rbac.authorization.k8s.io/v1 is supported
rolebinding.rbac.authorization.k8s.io "testing-RB" deleted
role.rbac.authorization.k8s.io "testing-R" deleted
warning: deleting cluster-scoped resources, not scoped to the provided namespace
clusterrole.rbac.authorization.k8s.io "testing-CR" deleted
clusterrolebinding.rbac.authorization.k8s.io "testing-CRB" deleted
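The authorization block combines kubectl auth can-i checks with kubectl auth reconcile of the testing-R/RB/CR/CRB objects; a sketch of the command shapes involved (names other than those in the log are hypothetical):
kubectl auth can-i get pods                        # prints yes or no
kubectl auth can-i get /logs/                      # non-resource URL form
kubectl auth can-i get pods --subresource=log
# Mixing the two forms fails, as seen above:
#   error: --subresource can not be used with NonResourceURL
kubectl auth reconcile -f auth-test-objects.yaml   # hypothetical file listing the roles and bindings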
E0114 01:50:41.399026   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_retrieve_multiple_tests
Running command: run_retrieve_multiple_tests

+++ Running case: test-cmd.run_retrieve_multiple_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_retrieve_multiple_tests
E0114 01:50:41.503516   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0114 01:50:41] Testing kubectl(v1:multiget)
E0114 01:50:41.613847   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:242: Successful get nodes/127.0.0.1 service/kubernetes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:kubernetes:
+++ exit code: 0
Recording: run_resource_aliasing_tests
Running command: run_resource_aliasing_tests

+++ Running case: test-cmd.run_resource_aliasing_tests 
E0114 01:50:41.723992   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_resource_aliasing_tests
+++ [0114 01:50:41] Creating namespace namespace-1578966641-16721
namespace/namespace-1578966641-16721 created
Context "test" modified.
+++ [0114 01:50:41] Testing resource aliasing
replicationcontroller/cassandra created
I0114 01:50:42.092057   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966641-16721", Name:"cassandra", UID:"6bf08167-64ec-4740-8fc4-3ed0c7a1e3ca", APIVersion:"v1", ResourceVersion:"3059", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-9czcn
I0114 01:50:42.094489   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966641-16721", Name:"cassandra", UID:"6bf08167-64ec-4740-8fc4-3ed0c7a1e3ca", APIVersion:"v1", ResourceVersion:"3059", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-vfg28
service/cassandra created
E0114 01:50:42.400102   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Waiting for Get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}} : expected: cassandra:cassandra:cassandra:cassandra::, got: cassandra:cassandra:cassandra:cassandra:

discovery.sh:91: FAIL!
Get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}}
  Expected: cassandra:cassandra:cassandra:cassandra::
  Got:      cassandra:cassandra:cassandra:cassandra:
55 /home/prow/go/src/k8s.io/kubernetes/hack/lib/test.sh
E0114 01:50:42.504855   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
discovery.sh:92: Successful get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}}: cassandra:cassandra:cassandra:cassandra:
E0114 01:50:42.615282   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod "cassandra-9czcn" deleted
I0114 01:50:42.666964   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966641-16721", Name:"cassandra", UID:"6bf08167-64ec-4740-8fc4-3ed0c7a1e3ca", APIVersion:"v1", ResourceVersion:"3065", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-tbkhd
pod "cassandra-vfg28" deleted
I0114 01:50:42.684677   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966641-16721", Name:"cassandra", UID:"6bf08167-64ec-4740-8fc4-3ed0c7a1e3ca", APIVersion:"v1", ResourceVersion:"3076", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-xtb5g
replicationcontroller "cassandra" deleted
service "cassandra" deleted
E0114 01:50:42.724662   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_kubectl_explain_tests
Running command: run_kubectl_explain_tests

+++ Running case: test-cmd.run_kubectl_explain_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
... skipping 70 lines ...

FIELD:    message <string>

DESCRIPTION:
     A human readable message indicating details about why the pod is in this
     condition.
E0114 01:50:43.401288   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:43.505988   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
KIND:     CronJob
VERSION:  batch/v1beta1

DESCRIPTION:
     CronJob represents the configuration of a single cron job.

... skipping 21 lines ...

   status	<Object>
     Current status of a cron job. More info:
     https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#spec-and-status

+++ exit code: 0
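The explain output above (the message field and the CronJob kind) comes from schema lookups of this shape; the exact invocations sit in the skipped lines, so these are assumptions that produce the same kind of output:
kubectl explain pods.status.conditions.message     # FIELD: message <string>
kubectl explain cronjob                            # KIND: CronJob, VERSION: batch/v1beta1 on this server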
E0114 01:50:43.616469   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_swagger_tests
Running command: run_swagger_tests

+++ Running case: test-cmd.run_swagger_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_swagger_tests
+++ [0114 01:50:43] Testing swagger
E0114 01:50:43.725881   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_kubectl_sort_by_tests
Running command: run_kubectl_sort_by_tests

+++ Running case: test-cmd.run_kubectl_sort_by_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_sort_by_tests
+++ [0114 01:50:43] Testing kubectl --sort-by
get.sh:256: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
No resources found in namespace-1578966641-16721 namespace.
No resources found in namespace-1578966641-16721 namespace.
get.sh:264: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:50:44.402703   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/valid-pod created
E0114 01:50:44.507323   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:268: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
E0114 01:50:44.617668   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          0s
has:valid-pod
E0114 01:50:44.727302   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:I0114 01:50:44.738572   86365 loader.go:375] Config loaded from file:  /tmp/tmp.PaCMxnunqg/.kube/config
I0114 01:50:44.748479   86365 round_trippers.go:420] GET http://localhost:8080/api/v1/namespaces/namespace-1578966641-16721/pods?includeObject=Object
I0114 01:50:44.748506   86365 round_trippers.go:427] Request Headers:
I0114 01:50:44.748513   86365 round_trippers.go:431]     Accept: application/json;as=Table;v=v1;g=meta.k8s.io,application/json;as=Table;v=v1beta1;g=meta.k8s.io,application/json
I0114 01:50:44.748517   86365 round_trippers.go:431]     User-Agent: kubectl/v1.18.0 (linux/amd64) kubernetes/b008eda
... skipping 24 lines ...
get.sh:279: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
get.sh:283: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
get.sh:288: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/sorted-pod1 created
E0114 01:50:45.403980   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:292: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:
E0114 01:50:45.508529   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:45.619125   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/sorted-pod2 created
E0114 01:50:45.728481   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:296: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:sorted-pod2:
pod/sorted-pod3 created
get.sh:300: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:sorted-pod2:sorted-pod3:
Successful
message:sorted-pod1:sorted-pod2:sorted-pod3:
has:sorted-pod1:sorted-pod2:sorted-pod3:
Successful
message:sorted-pod3:sorted-pod2:sorted-pod1:
has:sorted-pod3:sorted-pod2:sorted-pod1:
Successful
message:sorted-pod2:sorted-pod1:sorted-pod3:
has:sorted-pod2:sorted-pod1:sorted-pod3:
E0114 01:50:46.405280   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:sorted-pod1:sorted-pod2:sorted-pod3:
has:sorted-pod1:sorted-pod2:sorted-pod3:
E0114 01:50:46.509777   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:I0114:I0114:I0114:I0114:I0114:I0114:I0114:I0114:I0114:I0114:NAME:sorted-pod2:sorted-pod1:sorted-pod3:
has:sorted-pod2:sorted-pod1:sorted-pod3:
Successful
message:I0114 01:50:46.545093   86627 loader.go:375] Config loaded from file:  /tmp/tmp.PaCMxnunqg/.kube/config
I0114 01:50:46.555361   86627 round_trippers.go:420] GET http://localhost:8080/api/v1/namespaces/namespace-1578966641-16721/pods
... skipping 8 lines ...
I0114 01:50:46.558322   86627 request.go:1022] Response Body: {"kind":"PodList","apiVersion":"v1","metadata":{"selfLink":"/api/v1/namespaces/namespace-1578966641-16721/pods","resourceVersion":"3097"},"items":[{"metadata":{"name":"sorted-pod1","namespace":"namespace-1578966641-16721","selfLink":"/api/v1/namespaces/namespace-1578966641-16721/pods/sorted-pod1","uid":"813e3dab-6f0c-4404-af5d-59a9d6d91d14","resourceVersion":"3095","creationTimestamp":"2020-01-14T01:50:45Z","labels":{"name":"sorted-pod3-label"}},"spec":{"containers":[{"name":"kubernetes-pause2","image":"k8s.gcr.io/pause:2.0","resources":{},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File","imagePullPolicy":"IfNotPresent"}],"restartPolicy":"Always","terminationGracePeriodSeconds":30,"dnsPolicy":"ClusterFirst","securityContext":{},"schedulerName":"default-scheduler","priority":0,"enableServiceLinks":true},"status":{"phase":"Pending","qosClass":"BestEffort"}},{"metadata":{"name":"sorted-pod2","namespace":"namespace-1578966641-16721","selfLink":"/api/v1/namespaces/namespace-1578966 [truncated 1942 chars]
NAME          AGE
sorted-pod2   1s
sorted-pod1   1s
sorted-pod3   1s
has not:Table
E0114 01:50:46.620365   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:325: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:sorted-pod2:sorted-pod3:
E0114 01:50:46.729761   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "sorted-pod1" force deleted
pod "sorted-pod2" force deleted
pod "sorted-pod3" force deleted
get.sh:329: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
... skipping 3 lines ...
+++ Running case: test-cmd.run_kubectl_all_namespace_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_all_namespace_tests
+++ [0114 01:50:47] Testing kubectl --all-namespace
get.sh:342: Successful get namespaces {{range.items}}{{if eq .metadata.name \"default\"}}{{.metadata.name}}:{{end}}{{end}}: default:
get.sh:346: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:50:47.406674   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/valid-pod created
E0114 01:50:47.511181   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:350: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
E0114 01:50:47.621814   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAMESPACE                    NAME        READY   STATUS    RESTARTS   AGE
namespace-1578966641-16721   valid-pod   0/1     Pending   0          0s
E0114 01:50:47.730977   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/all-ns-test-1 created
serviceaccount/test created
namespace/all-ns-test-2 created
serviceaccount/test created
Successful
message:NAMESPACE                    NAME      SECRETS   AGE
... skipping 117 lines ...
namespace-1578966632-544     default   0         16s
namespace-1578966634-2806    default   0         14s
namespace-1578966641-16721   default   0         7s
some-other-random            default   0         8s
has:all-ns-test-2
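(The --all-namespaces check creates a service account named test in two scratch namespaces and then expects both namespaces to appear in a cluster-wide listing. A minimal equivalent of what is being asserted:

  kubectl get serviceaccounts --all-namespaces | grep all-ns-test
)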
namespace "all-ns-test-1" deleted
E0114 01:50:48.407890   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:48.512464   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:48.623255   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:48.732304   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:49.409187   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:49.513718   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:49.624401   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:49.733531   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:50.410711   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:50.515373   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:50.625770   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:50.734882   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:51.412015   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:51.516660   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:51.627220   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:51.736126   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:52.413356   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:52.518069   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:52.628496   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:52.737377   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:53.414637   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:53.519515   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace "all-ns-test-2" deleted
E0114 01:50:53.630115   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:53.738909   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:54.415927   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:54.520908   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:54.631443   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:54.740166   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:55.417438   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:55.522188   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:55.632939   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:55.741629   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:56.419207   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:56.523611   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:56.635135   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:56.743145   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:57.420699   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:57.525215   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:57.636607   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:57.744413   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:58.422535   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 01:50:58.490471   54459 namespace_controller.go:185] Namespace has been deleted all-ns-test-1
E0114 01:50:58.526870   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:58.637828   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:50:58.745759   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:376: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
get.sh:380: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
get.sh:384: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
Successful
... skipping 6 lines ...

+++ Running case: test-cmd.run_template_output_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_template_output_tests
+++ [0114 01:50:59] Testing --template support on commands
+++ [0114 01:50:59] Creating namespace namespace-1578966659-20658
E0114 01:50:59.423953   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578966659-20658 created
E0114 01:50:59.528291   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
E0114 01:50:59.639529   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
template-output.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 01:50:59.747039   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/valid-pod created
{
    "apiVersion": "v1",
    "items": [
        {
            "apiVersion": "v1",
... skipping 56 lines ...
Successful
message:valid-pod:
has:valid-pod:
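(The template-output tests run each command through a Go template that prints only the object name followed by a colon, which is why the expected messages above are simply valid-pod:. A representative form, assuming the pod created above:

  kubectl get pod valid-pod -o go-template='{{.metadata.name}}:'
)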
Successful
message:valid-pod:
has:valid-pod:
E0114 01:51:00.425284   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:51:00.529851   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
E0114 01:51:00.640908   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:51:00.748858   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
Successful
message:scale-1:
has:scale-1:
... skipping 9 lines ...
Successful
message:pi:
has:pi:
Successful
message:127.0.0.1:
has:127.0.0.1:
E0114 01:51:01.426854   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 untainted
E0114 01:51:01.531698   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/cassandra created
I0114 01:51:01.635183   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966659-20658", Name:"cassandra", UID:"10ec6cd9-e51d-4769-a8cf-692d53b13833", APIVersion:"v1", ResourceVersion:"3146", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-qlqrk
I0114 01:51:01.639052   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966659-20658", Name:"cassandra", UID:"10ec6cd9-e51d-4769-a8cf-692d53b13833", APIVersion:"v1", ResourceVersion:"3146", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-wnkl5
E0114 01:51:01.641758   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:cassandra:
has:cassandra:
E0114 01:51:01.750137   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	reconciliation required create
	missing rules added:
		{Verbs:[create delete deletecollection get list patch update watch] APIGroups:[] Resources:[pods] ResourceNames:[] NonResourceURLs:[]}
	reconciliation required create
	missing subjects added:
		{Kind:Group APIGroup:rbac.authorization.k8s.io Name:system:masters Namespace:}
... skipping 17 lines ...
has:cm:
I0114 01:51:02.288351   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966659-20658", Name:"deploy", UID:"bdb8d419-7865-4c2d-9774-a759a44ec6ed", APIVersion:"apps/v1", ResourceVersion:"3155", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set deploy-74bcc58696 to 1
I0114 01:51:02.292702   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966659-20658", Name:"deploy-74bcc58696", UID:"b6152726-68de-4ae3-b762-964379a7cdcf", APIVersion:"apps/v1", ResourceVersion:"3156", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deploy-74bcc58696-ggj4t
Successful
message:deploy:
has:deploy:
E0114 01:51:02.428396   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
cronjob.batch/pi created
E0114 01:51:02.533839   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
E0114 01:51:02.643166   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:bar:
has:bar:
Successful
message:foo:
has:foo:
E0114 01:51:02.751755   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:myrole:
has:myrole:
Successful
message:foo:
has:foo:
... skipping 9 lines ...
Successful
message:valid-pod:
has:valid-pod:
Successful
message:valid-pod:
has:valid-pod:
E0114 01:51:03.429667   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
E0114 01:51:03.535233   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:kubernetes:
has:kubernetes:
E0114 01:51:03.644411   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
I0114 01:51:03.706846   54459 namespace_controller.go:185] Namespace has been deleted all-ns-test-2
E0114 01:51:03.752972   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
Successful
message:foo:
has:foo:
... skipping 12 lines ...
Successful
message:foo:
has:foo:
Successful
message:foo:
has:foo:
E0114 01:51:04.430902   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:apiVersion: v1
clusters:
- cluster:
    certificate-authority-data: DATA+OMITTED
    server: https://does-not-work
... skipping 6 lines ...
  name: test
current-context: test
kind: Config
preferences: {}
users: null
has:kind: Config
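(The kind: Config check runs against kubectl config view, which prints the merged kubeconfig with certificate data redacted — the DATA+OMITTED above. A minimal equivalent of the assertion:

  kubectl config view | grep '^kind:'
)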
E0114 01:51:04.536179   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:deploy:
has:deploy:
E0114 01:51:04.645510   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:deploy:
has:deploy:
E0114 01:51:04.754001   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:deploy:
has:deploy:
Successful
message:deploy:
has:deploy:
... skipping 13 lines ...
pod "cassandra-wnkl5" deleted
I0114 01:51:05.261111   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1578966659-20658", Name:"cassandra", UID:"10ec6cd9-e51d-4769-a8cf-692d53b13833", APIVersion:"v1", ResourceVersion:"3152", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-9cshn
I0114 01:51:05.262558   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966659-20658", Name:"deploy-74bcc58696", UID:"b6152726-68de-4ae3-b762-964379a7cdcf", APIVersion:"apps/v1", ResourceVersion:"3163", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deploy-74bcc58696-xrtj7
pod "deploy-74bcc58696-ggj4t" deleted
pod "valid-pod" deleted
replicationcontroller "cassandra" deleted
E0114 01:51:05.432225   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
clusterrole.rbac.authorization.k8s.io "myclusterrole" deleted
E0114 01:51:05.537564   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
clusterrolebinding.rbac.authorization.k8s.io "foo" deleted
E0114 01:51:05.646821   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "deploy" deleted
+++ exit code: 0
Recording: run_certificates_tests
Running command: run_certificates_tests

E0114 01:51:05.755791   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ Running case: test-cmd.run_certificates_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_certificates_tests
+++ [0114 01:51:05] Testing certificates
certificatesigningrequest.certificates.k8s.io/foo created
certificate.sh:29: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
... skipping 39 lines ...
    "metadata": {
        "resourceVersion": "",
        "selfLink": ""
    }
}
certificate.sh:32: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Approved
E0114 01:51:06.433662   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io "foo" deleted
E0114 01:51:06.538988   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:34: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
E0114 01:51:06.648117   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:51:06.757013   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
certificate.sh:37: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo approved
{
    "apiVersion": "v1",
    "items": [
... skipping 55 lines ...
        "selfLink": ""
    }
}
certificate.sh:40: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Approved
certificatesigningrequest.certificates.k8s.io "foo" deleted
certificate.sh:42: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
E0114 01:51:07.434921   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:51:07.540207   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
E0114 01:51:07.649446   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:46: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
E0114 01:51:07.760994   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo denied
{
    "apiVersion": "v1",
    "items": [
        {
            "apiVersion": "certificates.k8s.io/v1beta1",
... skipping 36 lines ...
    }
}
certificate.sh:49: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Denied
certificatesigningrequest.certificates.k8s.io "foo" deleted
certificate.sh:51: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo created
E0114 01:51:08.436327   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:54: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
E0114 01:51:08.541656   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo denied
E0114 01:51:08.650803   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
{
    "apiVersion": "v1",
    "items": [
        {
            "apiVersion": "certificates.k8s.io/v1beta1",
            "kind": "CertificateSigningRequest",
... skipping 31 lines ...
    "kind": "List",
    "metadata": {
        "resourceVersion": "",
        "selfLink": ""
    }
}
E0114 01:51:08.762207   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:57: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Denied
certificatesigningrequest.certificates.k8s.io "foo" deleted
certificate.sh:59: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
+++ exit code: 0
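(The certificate tests above cycle a CSR named foo through approve and deny and check the recorded condition type with the same Go template shown in the certificate.sh lines. A representative sequence; csr-foo.yaml is a placeholder for whatever manifest the script applies:

  kubectl create -f csr-foo.yaml
  kubectl certificate approve foo
  kubectl get csr foo -o go-template='{{range .status.conditions}}{{.type}}{{end}}'   # prints: Approved
  kubectl delete csr foo
  # the deny path is symmetric: re-create the CSR, then
  kubectl certificate deny foo
)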
Recording: run_cluster_management_tests
Running command: run_cluster_management_tests

+++ Running case: test-cmd.run_cluster_management_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_cluster_management_tests
+++ [0114 01:51:09] Testing cluster-management commands
node-management.sh:27: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
pod/test-pod-1 created
E0114 01:51:09.437465   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:51:09.543193   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/test-pod-2 created
E0114 01:51:09.652131   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:76: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: 
E0114 01:51:09.763379   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 tainted
node-management.sh:79: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: dedicated=foo:PreferNoSchedule
node/127.0.0.1 untainted
node-management.sh:83: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: 
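(These taint checks add a dedicated=foo:PreferNoSchedule taint to node 127.0.0.1, verify it via .spec.taints, and then remove it. Representative commands:

  kubectl taint nodes 127.0.0.1 dedicated=foo:PreferNoSchedule
  kubectl taint nodes 127.0.0.1 dedicated-   # trailing '-' removes the taint by key
)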
node-management.sh:87: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 cordoned (dry run)
node-management.sh:89: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0114 01:51:10.438661   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:93: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 cordoned (dry run)
node/127.0.0.1 drained (dry run)
E0114 01:51:10.544342   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:96: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
E0114 01:51:10.653188   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:97: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0114 01:51:10.764622   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:101: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node-management.sh:103: Successful get pods {{range .items}}{{.metadata.name}},{{end}}: test-pod-1,test-pod-2,
node/127.0.0.1 cordoned
node/127.0.0.1 drained
node-management.sh:106: Successful get pods/test-pod-2 {{.metadata.name}}: test-pod-2
(Bpod "test-pod-2" deleted
node/127.0.0.1 uncordoned
node-management.sh:111: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
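(The cordon/drain block marks the node unschedulable, evicts or deletes the test pods, and uncordons it again; the (dry run) variants above only report what would happen. A hedged equivalent — the flags shown are the common ones, not necessarily the exact set node-management.sh passes:

  kubectl cordon 127.0.0.1
  kubectl drain 127.0.0.1 --ignore-daemonsets --force
  kubectl uncordon 127.0.0.1
  kubectl get node 127.0.0.1 -o go-template='{{.spec.unschedulable}}'
)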
E0114 01:51:11.442110   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:115: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0114 01:51:11.545575   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:node/127.0.0.1 already uncordoned (dry run)
has:already uncordoned
E0114 01:51:11.654581   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:119: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0114 01:51:11.765872   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 labeled
node-management.sh:124: Successful get nodes 127.0.0.1 {{.metadata.labels.test}}: label
Successful
message:error: cannot specify both a node name and a --selector option
See 'kubectl drain -h' for help and examples
has:cannot specify both a node name
Successful
message:error: USAGE: cordon NODE [flags]
See 'kubectl cordon -h' for help and examples
has:error\: USAGE\: cordon NODE
node/127.0.0.1 already uncordoned
Successful
message:error: You must provide one or more resources by argument or filename.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
   '<resource> <name>'
   '<resource>'
has:must provide one or more resources
Successful
message:node/127.0.0.1 cordoned
has:node/127.0.0.1 cordoned
Successful
message:
has not:cordoned
E0114 01:51:12.443544   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:145: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: true
+++ exit code: 0
E0114 01:51:12.548571   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_plugins_tests
Running command: run_plugins_tests

+++ Running case: test-cmd.run_plugins_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_plugins_tests
+++ [0114 01:51:12] Testing kubectl plugins
E0114 01:51:12.655953   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/version/kubectl-version
  - warning: kubectl-version overwrites existing command: "kubectl version"

error: one plugin warning was found
has:kubectl-version overwrites existing command: "kubectl version"
E0114 01:51:12.767431   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/kubectl-foo
test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo
  - warning: test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin: test/fixtures/pkg/kubectl/plugins/kubectl-foo

error: one plugin warning was found
has:test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/kubectl-foo
has:plugins are available
Successful
message:Unable read directory "test/fixtures/pkg/kubectl/plugins/empty" from your PATH: open test/fixtures/pkg/kubectl/plugins/empty: no such file or directory. Skipping...
error: unable to find any kubectl plugins in your PATH
has:unable to find any kubectl plugins in your PATH
Successful
message:I am plugin foo
has:plugin foo
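(kubectl discovers plugins as executables named kubectl-<name> on PATH: kubectl plugin list reports them, including the overshadowing and name-override warnings asserted above, and kubectl foo dispatches to kubectl-foo. A minimal sketch, with /tmp standing in for the fixture directories used by the test:

  printf '#!/usr/bin/env bash\necho "I am plugin foo"\n' > /tmp/kubectl-foo
  chmod +x /tmp/kubectl-foo
  PATH=/tmp:$PATH kubectl plugin list
  PATH=/tmp:$PATH kubectl foo   # prints: I am plugin foo
)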
Successful
message:I am plugin bar called with args test/fixtures/pkg/kubectl/plugins/bar/kubectl-bar arg1
... skipping 10 lines ...

+++ Running case: test-cmd.run_impersonation_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_impersonation_tests
+++ [0114 01:51:13] Testing impersonation
Successful
message:error: requesting groups or user-extra for  without impersonating a user
has:without impersonating a user
E0114 01:51:13.444717   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:51:13.549780   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
E0114 01:51:13.657979   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
authorization.sh:68: Successful get csr/foo {{.spec.username}}: user1
E0114 01:51:13.768613   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
authorization.sh:69: Successful get csr/foo {{range .spec.groups}}{{.}}{{end}}: system:authenticated
certificatesigningrequest.certificates.k8s.io "foo" deleted
certificatesigningrequest.certificates.k8s.io/foo created
authorization.sh:74: Successful get csr/foo {{len .spec.groups}}: 3
authorization.sh:75: Successful get csr/foo {{range .spec.groups}}{{.}} {{end}}: group2 group1 ,,,chameleon 
certificatesigningrequest.certificates.k8s.io "foo" deleted
+++ exit code: 0
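(The impersonation tests create the CSR while acting as another identity via --as/--as-group, then assert on the .spec.username and .spec.groups the API server recorded. Representative invocations; csr-foo.yaml is again a placeholder manifest:

  kubectl create -f csr-foo.yaml --as=user1
  kubectl get csr foo -o go-template='{{.spec.username}}'   # user1
  kubectl delete csr foo
  kubectl create -f csr-foo.yaml --as=user1 --as-group=group2 --as-group=group1
  kubectl get csr foo -o go-template='{{range .spec.groups}}{{.}} {{end}}'
)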
E0114 01:51:14.445938   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_wait_tests
Running command: run_wait_tests

+++ Running case: test-cmd.run_wait_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_wait_tests
+++ [0114 01:51:14] Testing kubectl wait
+++ [0114 01:51:14] Creating namespace namespace-1578966674-12464
E0114 01:51:14.551165   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1578966674-12464 created
E0114 01:51:14.659393   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
deployment.apps/test-1 created
E0114 01:51:14.769697   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 01:51:14.773273   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966674-12464", Name:"test-1", UID:"112d2a3e-b7d7-40fc-9bfa-abee6e503ee5", APIVersion:"apps/v1", ResourceVersion:"3248", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-1-6d98955cc9 to 1
I0114 01:51:14.779508   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966674-12464", Name:"test-1-6d98955cc9", UID:"d0c98ca0-972a-4a31-a211-e4a826f13ba3", APIVersion:"apps/v1", ResourceVersion:"3249", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-1-6d98955cc9-8kgkt
deployment.apps/test-2 created
I0114 01:51:14.866697   54459 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1578966674-12464", Name:"test-2", UID:"b43942de-f167-4da0-b558-20621ba0307d", APIVersion:"apps/v1", ResourceVersion:"3258", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-2-65897ff84d to 1
I0114 01:51:14.871499   54459 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1578966674-12464", Name:"test-2-65897ff84d", UID:"b8d1b765-3403-46ff-a022-a8a5a64da27f", APIVersion:"apps/v1", ResourceVersion:"3259", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-2-65897ff84d-7s5bp
wait.sh:36: Successful get deployments {{range .items}}{{.metadata.name}},{{end}}: test-1,test-2,
E0114 01:51:15.447240   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:51:15.552311   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:51:15.660636   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:51:15.771146   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:51:16.448515   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:51:16.553776   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:51:16.661943   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 01:51:16.772538   54459 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "test-1" deleted
deployment.apps "test-2" deleted
Successful
message:deployment.apps/test-1 condition met
deployment.apps/test-2 condition met
has:test-1 condition met
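(The wait test creates two single-replica deployments and blocks until their Available condition is true; the "condition met" lines above are kubectl wait's success output. A representative call, assuming the same names and a generous timeout:

  kubectl wait --for=condition=Available --timeout=60s \
    deployment/test-1 deployment/test-2
)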
... skipping 27 lines ...
I0114 01:51:17.387616   50985 available_controller.go:398] Shutting down AvailableConditionController
I0114 01:51:17.388124   50985 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0114 01:51:17.388124   50985 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0114 01:51:17.388212   50985 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0114 01:51:17.388232   50985 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0114 01:51:17.388267   50985 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0114 01:51:17.388414   50985 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0114 01:51:17.388461   50985 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0114 01:51:17.388486   50985 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0114 01:51:17.388416   50985 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0114 01:51:17.388542   50985 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0114 01:51:17.388574   50985 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0114 01:51:17.388595   50985 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0114 01:51:17.388625   50985 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I0114 01:51:17.388626   50985 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0114 01:51:17.388739   50985 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0114 01:51:17.388795   50985 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0114 01:51:17.388834   50985 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 01:51:17.388861   50985 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 01:51:17.388863   50985 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
E0114 01:51:17.388875   50985 controller.go:183] rpc error: code = Unavailable desc = transport is closing
junit report dir: /logs/artifacts
+++ [0114 01:51:17] Clean up complete
+ make test-integration
+++ [0114 01:51:21] Checking etcd is on PATH
/home/prow/go/src/k8s.io/kubernetes/third_party/etcd/etcd
+++ [0114 01:51:21] Starting etcd instance
... skipping 310 lines ...
    synthetic_master_test.go:721: UPDATE_NODE_APISERVER is not set

=== SKIP: test/integration/scheduler_perf TestSchedule100Node3KPods (0.00s)
    scheduler_test.go:73: Skipping because we want to run short tests


=== Failed
=== FAIL: test/integration/client TestDynamicClient (6.87s)
I0114 01:53:44.269198  106069 establishing_controller.go:85] Shutting down EstablishingController
I0114 01:53:45.242100  106069 serving.go:307] Generated self-signed cert (/tmp/kubernetes-kube-apiserver447156705/apiserver.crt, /tmp/kubernetes-kube-apiserver447156705/apiserver.key)
I0114 01:53:45.242135  106069 server.go:596] external host was not specified, using 127.0.0.1
W0114 01:53:45.242146  106069 authentication.go:439] AnonymousAuth is not allowed with the AlwaysAllow authorizer. Resetting AnonymousAuth to false. You should use a different authorizer
W0114 01:53:45.579199  106069 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 01:53:45.579230  106069 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
... skipping 159 lines ...
I0114 01:53:45.904745  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.904780  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.908202  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.908249  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 01:53:45.909636  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:45.909666  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
E0114 01:53:46.090492  106069 controller.go:183] an error on the server ("") has prevented the request from succeeding (get endpoints kubernetes)
W0114 01:53:46.157187  106069 genericapiserver.go:404] Skipping API discovery.k8s.io/v1alpha1 because it has no resources.
I0114 01:53:46.582451  106069 client.go:361] parsed scheme: "endpoint"
I0114 01:53:46.582539  106069 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0114 01:53:46.695917  106069 genericapiserver.go:404] Skipping API apps/v1beta2 because it has no resources.
W0114 01:53:46.695945  106069 genericapiserver.go:404] Skipping API apps/v1beta1 because it has no resources.
I0114 01:53:46.714243  106069 plugins.go:158] Loaded 9 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,MutatingAdmissionWebhook,RuntimeClass.
... skipping 42 lines ...
E0114 01:53:49.960086  106069 controller.go:151] Unable to remove old endpoints from kubernetes service: StorageError: key not found, Code: 1, Key: /e0452299-09a2-454d-a942-b4a8d60613f1/registry/masterleases/127.0.0.1, ResourceVersion: 0, AdditionalErrorMsg: 
I0114 01:53:50.053870  106069 cache.go:39] Caches are synced for AvailableConditionController controller
I0114 01:53:50.054045  106069 cache.go:39] Caches are synced for autoregister controller
I0114 01:53:50.054092  106069 cache.go:39] Caches are synced for APIServiceRegistrationController controller
I0114 01:53:50.054781  106069 shared_informer.go:213] Caches are synced for cluster_authentication_trust_controller 
I0114 01:53:50.057101  106069 shared_informer.go:213] Caches are synced for crd-autoregister 
E0114 01:53:50.088859  106069 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0114 01:53:50.105868  106069 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0114 01:53:50.116031  106069 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0114 01:53:50.120535  106069 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0114 01:53:50.123036  106069 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
I0114 01:53:50.952571  106069 controller.go:107] OpenAPI AggregationController: Processing item 
I0114 01:53:50.952612  106069 controller.go:130] OpenAPI AggregationController: action for item : Nothing (removed from the queue).
I0114 01:53:50.952631  106069 controller.go:130] OpenAPI AggregationController: action for item k8s_internal_local_delegation_chain_0000000000: Nothing (removed from the queue).
I0114 01:53:50.962826  106069 storage_scheduling.go:133] created PriorityClass system-node-critical with value 2000001000
I0114 01:53:50.967546  106069 storage_scheduling.go:133] created PriorityClass system-cluster-critical with value 2000000000
I0114 01:53:50.967566  106069 storage_scheduling.go:142] all system priority classes are created successfully or already exist.
... skipping 21 lines ...
    testserver.go:198: Waiting for /healthz to be ok...
    dynamic_client_test.go:88: unexpected pod in list. wanted &v1.Pod{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"testjhlj7", GenerateName:"test", Namespace:"default", SelfLink:"/api/v1/namespaces/default/pods/testjhlj7", UID:"d51cbc72-a254-4809-affa-a1f7ca5b3cc8", ResourceVersion:"8161", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714563631, loc:(*time.Location)(0x7541d00)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"client.test", Operation:"Update", APIVersion:"v1", Time:(*v1.Time)(0xc0491dc120), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc0491dc140)}}}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"test", Image:"test-image", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"Always", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc04d24d968), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc0482870e0), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration{v1.Toleration{Key:"node.kubernetes.io/not-ready", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc04d24d990)}, v1.Toleration{Key:"node.kubernetes.io/unreachable", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc04d24d9b0)}}, HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(0xc04d24d9b8), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(0xc04d24d9bc), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}, Status:v1.PodStatus{Phase:"Pending", Conditions:[]v1.PodCondition(nil), Message:"", Reason:"", NominatedNodeName:"", HostIP:"", PodIP:"", PodIPs:[]v1.PodIP(nil), StartTime:(*v1.Time)(nil), InitContainerStatuses:[]v1.ContainerStatus(nil), ContainerStatuses:[]v1.ContainerStatus(nil), QOSClass:"BestEffort", EphemeralContainerStatuses:[]v1.ContainerStatus(nil)}}, got &v1.Pod{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"testjhlj7", GenerateName:"test", Namespace:"default", SelfLink:"/api/v1/namespaces/default/pods/testjhlj7", 
UID:"d51cbc72-a254-4809-affa-a1f7ca5b3cc8", ResourceVersion:"8161", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714563631, loc:(*time.Location)(0x7541d00)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"client.test", Operation:"Update", APIVersion:"v1", Time:(*v1.Time)(0xc0491dcee0), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc0491dcec0)}}}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"test", Image:"test-image", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"Always", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc04f8ca4b8), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc0483337a0), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration{v1.Toleration{Key:"node.kubernetes.io/not-ready", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc04f8ca500)}, v1.Toleration{Key:"node.kubernetes.io/unreachable", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc04f8ca520)}}, HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(0xc04f8ca498), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(0xc04f8ca479), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}, Status:v1.PodStatus{Phase:"Pending", Conditions:[]v1.PodCondition(nil), Message:"", Reason:"", NominatedNodeName:"", HostIP:"", PodIP:"", PodIPs:[]v1.PodIP(nil), StartTime:(*v1.Time)(nil), InitContainerStatuses:[]v1.ContainerStatus(nil), ContainerStatuses:[]v1.ContainerStatus(nil), QOSClass:"BestEffort", EphemeralContainerStatuses:[]v1.ContainerStatus(nil)}}


DONE 2486 tests, 4 skipped, 1 failure in 5.446s
+++ [0114 02:02:42] Saved JUnit XML test report to /logs/artifacts/junit_20200114-015127.xml
make[1]: *** [Makefile:185: test] Error 1
!!! [0114 02:02:42] Call tree:
!!! [0114 02:02:42]  1: hack/make-rules/test-integration.sh:97 runTests(...)
+++ [0114 02:02:42] Cleaning up etcd
+++ [0114 02:02:43] Integration test cleanup complete
make: *** [Makefile:204: test-integration] Error 1
+ EXIT_VALUE=2
+ set +o xtrace
Cleaning up after docker in docker.
================================================================================
[Barnacle] 2020/01/14 02:02:43 Cleaning up Docker data root...
[Barnacle] 2020/01/14 02:02:43 Removing all containers.
... skipping 12 lines ...