Result: FAILURE
Tests: 1 failed / 2610 succeeded
Started: 2020-01-14 15:50
Elapsed: 24m34s
Revision: master
resultstore: https://source.cloud.google.com/results/invocations/755008b4-4673-4334-8a63-67b40df99fe7/targets/test

Test Failures


k8s.io/kubernetes/test/integration/client TestDynamicClient 6.89s

go test -v k8s.io/kubernetes/test/integration/client -run TestDynamicClient$
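For orientation, here is a minimal sketch of the round trip this test exercises, reconstructed from the failure message at the end of this log. The flow, host address, and error handling are assumptions rather than the actual dynamic_client_test.go source, and the signatures follow current client-go: create a pod with the typed client, list it back through the dynamic client, convert the unstructured item, and deep-compare.

package main

import (
	"context"
	"fmt"
	"reflect"

	v1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/runtime"
	"k8s.io/apimachinery/pkg/runtime/schema"
	"k8s.io/client-go/dynamic"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	ctx := context.Background()
	// Illustrative host: the test apiserver in this run came up on 127.0.0.1:41927.
	cfg := &rest.Config{
		Host:            "https://127.0.0.1:41927",
		TLSClientConfig: rest.TLSClientConfig{Insecure: true},
	}
	typed := kubernetes.NewForConfigOrDie(cfg)
	dyn := dynamic.NewForConfigOrDie(cfg)

	// Create a pod through the typed client, matching the fields visible in
	// the failure dump (GenerateName "test", one "test-image" container).
	want, err := typed.CoreV1().Pods("default").Create(ctx, &v1.Pod{
		ObjectMeta: metav1.ObjectMeta{GenerateName: "test"},
		Spec:       v1.PodSpec{Containers: []v1.Container{{Name: "test", Image: "test-image"}}},
	}, metav1.CreateOptions{})
	if err != nil {
		panic(err)
	}

	// List pods back through the dynamic (unstructured) client.
	gvr := schema.GroupVersionResource{Version: "v1", Resource: "pods"}
	list, err := dyn.Resource(gvr).Namespace("default").List(ctx, metav1.ListOptions{})
	if err != nil {
		panic(err)
	}

	// Convert the unstructured item back to a typed Pod and deep-compare;
	// a comparison of this kind is the step reported as failing below.
	got := &v1.Pod{}
	if err := runtime.DefaultUnstructuredConverter.FromUnstructured(list.Items[0].Object, got); err != nil {
		panic(err)
	}
	fmt.Println("round-trip equal:", reflect.DeepEqual(want, got))
}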
=== RUN   TestDynamicClient
I0114 16:06:21.182919  106102 controller.go:180] Shutting down kubernetes service endpoint reconciler
I0114 16:06:21.183409  106102 dynamic_cafile_content.go:181] Shutting down request-header::/tmp/kubernetes-kube-apiserver633882346/proxy-ca.crt
I0114 16:06:21.183528  106102 controller.go:123] Shutting down OpenAPI controller
I0114 16:06:21.183683  106102 nonstructuralschema_controller.go:197] Shutting down NonStructuralSchemaConditionController
I0114 16:06:21.183775  106102 naming_controller.go:300] Shutting down NamingConditionController
I0114 16:06:21.183859  106102 apiapproval_controller.go:196] Shutting down KubernetesAPIApprovalPolicyConformantConditionController
I0114 16:06:21.183945  106102 establishing_controller.go:85] Shutting down EstablishingController
I0114 16:06:21.184031  106102 customresource_discovery_controller.go:220] Shutting down DiscoveryController
I0114 16:06:21.184118  106102 apiservice_controller.go:106] Shutting down APIServiceRegistrationController
I0114 16:06:21.184204  106102 crd_finalizer.go:276] Shutting down CRDFinalizer
I0114 16:06:21.184288  106102 available_controller.go:398] Shutting down AvailableConditionController
I0114 16:06:21.184375  106102 cluster_authentication_trust_controller.go:463] Shutting down cluster_authentication_trust_controller controller
I0114 16:06:21.184392  106102 autoregister_controller.go:164] Shutting down autoregister controller
I0114 16:06:21.184409  106102 crdregistration_controller.go:142] Shutting down crd-autoregister controller
I0114 16:06:21.184568  106102 controller.go:87] Shutting down OpenAPI AggregationController
I0114 16:06:21.184620  106102 dynamic_cafile_content.go:181] Shutting down request-header::/tmp/kubernetes-kube-apiserver633882346/proxy-ca.crt
I0114 16:06:21.184642  106102 dynamic_cafile_content.go:181] Shutting down client-ca-bundle::/tmp/kubernetes-kube-apiserver633882346/client-ca.crt
I0114 16:06:21.184741  106102 secure_serving.go:222] Stopped listening on 127.0.0.1:35591
I0114 16:06:21.184752  106102 tlsconfig.go:256] Shutting down DynamicServingCertificateController
I0114 16:06:21.184793  106102 dynamic_serving_content.go:144] Shutting down serving-cert::/tmp/kubernetes-kube-apiserver633882346/apiserver.crt::/tmp/kubernetes-kube-apiserver633882346/apiserver.key
I0114 16:06:21.184818  106102 dynamic_cafile_content.go:181] Shutting down client-ca-bundle::/tmp/kubernetes-kube-apiserver633882346/client-ca.crt
E0114 16:06:21.187215  106102 reflector.go:320] k8s.io/kube-aggregator/pkg/client/informers/externalversions/factory.go:117: Failed to watch *v1.APIService: Get https://127.0.0.1:35591/apis/apiregistration.k8s.io/v1/apiservices?allowWatchBookmarks=true&resourceVersion=7699&timeout=5m53s&timeoutSeconds=353&watch=true: dial tcp 127.0.0.1:35591: connect: connection refused
I0114 16:06:22.137951  106102 serving.go:307] Generated self-signed cert (/tmp/kubernetes-kube-apiserver144043841/apiserver.crt, /tmp/kubernetes-kube-apiserver144043841/apiserver.key)
I0114 16:06:22.137978  106102 server.go:596] external host was not specified, using 127.0.0.1
W0114 16:06:22.137989  106102 authentication.go:439] AnonymousAuth is not allowed with the AlwaysAllow authorizer. Resetting AnonymousAuth to false. You should use a different authorizer
E0114 16:06:22.771830  106102 controller.go:183] an error on the server ("") has prevented the request from succeeding (get endpoints kubernetes)
W0114 16:06:22.866774  106102 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 16:06:22.866803  106102 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 16:06:22.866812  106102 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 16:06:22.866958  106102 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 16:06:22.867778  106102 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 16:06:22.867812  106102 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 16:06:22.867832  106102 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 16:06:22.867850  106102 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 16:06:22.868018  106102 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 16:06:22.868175  106102 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 16:06:22.868209  106102 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 16:06:22.868258  106102 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0114 16:06:22.868279  106102 plugins.go:158] Loaded 9 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,MutatingAdmissionWebhook,RuntimeClass.
I0114 16:06:22.868290  106102 plugins.go:161] Loaded 6 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ValidatingAdmissionWebhook,RuntimeClass,ResourceQuota.
I0114 16:06:22.869312  106102 plugins.go:158] Loaded 9 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,MutatingAdmissionWebhook,RuntimeClass.
I0114 16:06:22.869335  106102 plugins.go:161] Loaded 6 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ValidatingAdmissionWebhook,RuntimeClass,ResourceQuota.
I0114 16:06:22.870500  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:22.870529  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:22.871424  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:22.871462  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0114 16:06:22.898629  106102 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0114 16:06:22.899410  106102 master.go:264] Using reconciler: lease
I0114 16:06:22.899653  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:22.899684  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:22.901801  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:22.901834  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:22.902816  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:22.902852  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:22.903766  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:22.903797  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:22.904626  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:22.904658  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:22.905697  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:22.905731  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:22.906700  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:22.906757  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:22.907626  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:22.907655  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:22.908804  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:22.908830  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:22.909867  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:22.909900  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:22.910786  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:22.910814  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:22.911674  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:22.911726  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:22.912861  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:22.912887  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:22.913879  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:22.913908  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:22.914985  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:22.915014  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:22.916169  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:22.916234  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:22.917082  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:22.917110  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:22.917958  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:22.917974  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:22.918693  106102 rest.go:113] the default service ipfamily for this cluster is: IPv4
I0114 16:06:23.052969  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.053009  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.054475  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.054532  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.055649  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.055680  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.056524  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.056558  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.057437  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.057569  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.058813  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.058843  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.059875  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.059919  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.061157  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.061186  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.062310  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.062358  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.063389  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.063419  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.064380  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.064407  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.065833  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.065868  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.066684  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.066722  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.068000  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.068065  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.068946  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.068979  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.070007  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.070039  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.071389  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.071509  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.072592  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.072625  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.074359  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.074389  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.076754  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.076909  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.078331  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.078464  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.079567  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.079601  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.080538  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.080570  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.081568  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.081594  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.082538  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.082683  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.083687  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.083720  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.084486  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.084517  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.085841  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.085872  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.086725  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.086776  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.087686  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.087711  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.090113  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.090147  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.091147  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.091179  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.092246  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.092279  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.093457  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.093634  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.095038  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.095069  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.096198  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.096224  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.097642  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.097674  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.098554  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.098583  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.099674  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.099710  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.100852  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.100884  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.101958  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.102359  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.103331  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.103368  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.104348  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.104489  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.105967  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.105998  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.107558  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.107594  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.108784  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.108817  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.110109  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.110143  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.111457  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.111485  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.112548  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.112579  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.114992  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.115028  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.116042  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.116076  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.117038  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.117075  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.117881  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.117909  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.119636  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.119675  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0114 16:06:23.358653  106102 genericapiserver.go:404] Skipping API discovery.k8s.io/v1alpha1 because it has no resources.
W0114 16:06:23.515913  106102 genericapiserver.go:404] Skipping API apps/v1beta2 because it has no resources.
W0114 16:06:23.515962  106102 genericapiserver.go:404] Skipping API apps/v1beta1 because it has no resources.
I0114 16:06:23.532415  106102 plugins.go:158] Loaded 9 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,MutatingAdmissionWebhook,RuntimeClass.
I0114 16:06:23.532453  106102 plugins.go:161] Loaded 6 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ValidatingAdmissionWebhook,RuntimeClass,ResourceQuota.
W0114 16:06:23.533997  106102 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0114 16:06:23.534253  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.534291  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 16:06:23.535340  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.535375  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0114 16:06:23.538326  106102 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0114 16:06:23.539043  106102 aggregator.go:182] Skipping APIService creation for flowcontrol.apiserver.k8s.io/v1alpha1
I0114 16:06:23.866524  106102 client.go:361] parsed scheme: "endpoint"
I0114 16:06:23.866620  106102 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
W0114 16:06:25.185081  106102 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.MutatingWebhookConfiguration ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0114 16:06:25.185101  106102 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1beta1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0114 16:06:25.185156  106102 reflector.go:340] k8s.io/kubernetes/pkg/master/controller/clusterauthenticationtrust/cluster_authentication_trust_controller.go:444: watch of *v1.ConfigMap ended with: very short watch: k8s.io/kubernetes/pkg/master/controller/clusterauthenticationtrust/cluster_authentication_trust_controller.go:444: Unexpected watch close - watch lasted less than a second and no items received
W0114 16:06:25.185169  106102 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.ServiceAccount ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0114 16:06:25.185201  106102 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.Endpoints ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0114 16:06:25.185235  106102 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.ValidatingWebhookConfiguration ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0114 16:06:25.185271  106102 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.Pod ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0114 16:06:25.185278  106102 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.Secret ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
W0114 16:06:25.185081  106102 reflector.go:340] k8s.io/apiextensions-apiserver/pkg/client/informers/externalversions/factory.go:117: watch of *v1.CustomResourceDefinition ended with: very short watch: k8s.io/apiextensions-apiserver/pkg/client/informers/externalversions/factory.go:117: Unexpected watch close - watch lasted less than a second and no items received
W0114 16:06:25.185328  106102 reflector.go:340] k8s.io/client-go/informers/factory.go:135: watch of *v1.StorageClass ended with: very short watch: k8s.io/client-go/informers/factory.go:135: Unexpected watch close - watch lasted less than a second and no items received
I0114 16:06:26.907012  106102 dynamic_cafile_content.go:166] Starting request-header::/tmp/kubernetes-kube-apiserver144043841/proxy-ca.crt
I0114 16:06:26.907077  106102 dynamic_cafile_content.go:166] Starting client-ca-bundle::/tmp/kubernetes-kube-apiserver144043841/client-ca.crt
I0114 16:06:26.907376  106102 dynamic_serving_content.go:129] Starting serving-cert::/tmp/kubernetes-kube-apiserver144043841/apiserver.crt::/tmp/kubernetes-kube-apiserver144043841/apiserver.key
I0114 16:06:26.908060  106102 secure_serving.go:178] Serving securely on 127.0.0.1:41927
I0114 16:06:26.908142  106102 autoregister_controller.go:140] Starting autoregister controller
I0114 16:06:26.908218  106102 cache.go:32] Waiting for caches to sync for autoregister controller
I0114 16:06:26.908235  106102 apiservice_controller.go:94] Starting APIServiceRegistrationController
I0114 16:06:26.908311  106102 cache.go:32] Waiting for caches to sync for APIServiceRegistrationController controller
I0114 16:06:26.908267  106102 available_controller.go:386] Starting AvailableConditionController
I0114 16:06:26.908383  106102 cache.go:32] Waiting for caches to sync for AvailableConditionController controller
I0114 16:06:26.908280  106102 crdregistration_controller.go:111] Starting crd-autoregister controller
I0114 16:06:26.908456  106102 shared_informer.go:206] Waiting for caches to sync for crd-autoregister
I0114 16:06:26.908204  106102 controller.go:81] Starting OpenAPI AggregationController
I0114 16:06:26.908277  106102 tlsconfig.go:241] Starting DynamicServingCertificateController
I0114 16:06:26.922289  106102 crd_finalizer.go:264] Starting CRDFinalizer
I0114 16:06:26.922623  106102 customresource_discovery_controller.go:209] Starting DiscoveryController
I0114 16:06:26.922654  106102 naming_controller.go:289] Starting NamingConditionController
I0114 16:06:26.922672  106102 establishing_controller.go:74] Starting EstablishingController
I0114 16:06:26.922863  106102 nonstructuralschema_controller.go:185] Starting NonStructuralSchemaConditionController
I0114 16:06:26.923002  106102 apiapproval_controller.go:184] Starting KubernetesAPIApprovalPolicyConformantConditionController
W0114 16:06:26.924402  106102 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0114 16:06:26.925156  106102 cluster_authentication_trust_controller.go:440] Starting cluster_authentication_trust_controller controller
I0114 16:06:26.925414  106102 shared_informer.go:206] Waiting for caches to sync for cluster_authentication_trust_controller
I0114 16:06:26.926079  106102 dynamic_cafile_content.go:166] Starting client-ca-bundle::/tmp/kubernetes-kube-apiserver144043841/client-ca.crt
I0114 16:06:26.926135  106102 dynamic_cafile_content.go:166] Starting request-header::/tmp/kubernetes-kube-apiserver144043841/proxy-ca.crt
I0114 16:06:26.922606  106102 controller.go:86] Starting OpenAPI controller
E0114 16:06:26.922417  106102 controller.go:151] Unable to remove old endpoints from kubernetes service: StorageError: key not found, Code: 1, Key: /7c5bfd04-b5e5-476d-8e5b-c8020b42003e/registry/masterleases/127.0.0.1, ResourceVersion: 0, AdditionalErrorMsg: 
E0114 16:06:26.929387  106102 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0114 16:06:26.935532  106102 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0114 16:06:26.944245  106102 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0114 16:06:26.951711  106102 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0114 16:06:26.953929  106102 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0114 16:06:26.972731  106102 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
I0114 16:06:27.008497  106102 cache.go:39] Caches are synced for autoregister controller
I0114 16:06:27.008539  106102 cache.go:39] Caches are synced for AvailableConditionController controller
I0114 16:06:27.008580  106102 cache.go:39] Caches are synced for APIServiceRegistrationController controller
I0114 16:06:27.009044  106102 shared_informer.go:213] Caches are synced for crd-autoregister 
I0114 16:06:27.025772  106102 shared_informer.go:213] Caches are synced for cluster_authentication_trust_controller 
I0114 16:06:27.907011  106102 controller.go:107] OpenAPI AggregationController: Processing item 
I0114 16:06:27.907047  106102 controller.go:130] OpenAPI AggregationController: action for item : Nothing (removed from the queue).
I0114 16:06:27.907064  106102 controller.go:130] OpenAPI AggregationController: action for item k8s_internal_local_delegation_chain_0000000000: Nothing (removed from the queue).
I0114 16:06:27.914967  106102 storage_scheduling.go:133] created PriorityClass system-node-critical with value 2000001000
I0114 16:06:27.919130  106102 storage_scheduling.go:133] created PriorityClass system-cluster-critical with value 2000000000
I0114 16:06:27.919154  106102 storage_scheduling.go:142] all system priority classes are created successfully or already exist.
W0114 16:06:27.972458  106102 lease.go:224] Resetting endpoints for master service "kubernetes" to [127.0.0.1]
E0114 16:06:27.974080  106102 controller.go:222] unable to sync kubernetes service: Endpoints "kubernetes" is invalid: subsets[0].addresses[0].ip: Invalid value: "127.0.0.1": may not be in the loopback range (127.0.0.0/8)
W0114 16:06:28.066584  106102 cacher.go:162] Terminating all watchers from cacher *apiextensions.CustomResourceDefinition
W0114 16:06:28.067006  106102 cacher.go:162] Terminating all watchers from cacher *core.LimitRange
W0114 16:06:28.067070  106102 cacher.go:162] Terminating all watchers from cacher *core.ResourceQuota
W0114 16:06:28.067342  106102 cacher.go:162] Terminating all watchers from cacher *core.Secret
W0114 16:06:28.068268  106102 cacher.go:162] Terminating all watchers from cacher *core.ConfigMap
W0114 16:06:28.068645  106102 cacher.go:162] Terminating all watchers from cacher *core.Namespace
W0114 16:06:28.068777  106102 cacher.go:162] Terminating all watchers from cacher *core.Endpoints
W0114 16:06:28.069020  106102 cacher.go:162] Terminating all watchers from cacher *core.Pod
W0114 16:06:28.069123  106102 cacher.go:162] Terminating all watchers from cacher *core.ServiceAccount
W0114 16:06:28.069367  106102 cacher.go:162] Terminating all watchers from cacher *core.Service
W0114 16:06:28.071348  106102 cacher.go:162] Terminating all watchers from cacher *node.RuntimeClass
W0114 16:06:28.073281  106102 cacher.go:162] Terminating all watchers from cacher *scheduling.PriorityClass
W0114 16:06:28.074078  106102 cacher.go:162] Terminating all watchers from cacher *storage.StorageClass
W0114 16:06:28.075631  106102 cacher.go:162] Terminating all watchers from cacher *admissionregistration.ValidatingWebhookConfiguration
W0114 16:06:28.075796  106102 cacher.go:162] Terminating all watchers from cacher *admissionregistration.MutatingWebhookConfiguration
W0114 16:06:28.076177  106102 cacher.go:162] Terminating all watchers from cacher *apiregistration.APIService
I0114 16:06:28.076723  106102 controller.go:180] Shutting down kubernetes service endpoint reconciler
--- FAIL: TestDynamicClient (6.89s)
    testserver.go:181: runtime-config=map[api/all:true]
    testserver.go:182: Starting kube-apiserver on port 41927...
    testserver.go:198: Waiting for /healthz to be ok...
    dynamic_client_test.go:88: unexpected pod in list. wanted &v1.Pod{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"test5zrxk", GenerateName:"test", Namespace:"default", SelfLink:"/api/v1/namespaces/default/pods/test5zrxk", UID:"ccd16e83-dc94-4a95-8822-9dd38c054556", ResourceVersion:"8163", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714614788, loc:(*time.Location)(0x753ec80)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"client.test", Operation:"Update", APIVersion:"v1", Time:(*v1.Time)(0xc03adbc760), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc03adbc780)}}}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"test", Image:"test-image", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"Always", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc03ab3bd58), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc0391fd7a0), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration{v1.Toleration{Key:"node.kubernetes.io/not-ready", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc03ab3bde0)}, v1.Toleration{Key:"node.kubernetes.io/unreachable", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc03ab3be20)}}, HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(0xc03ab3be28), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(0xc03ab3be2c), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}, Status:v1.PodStatus{Phase:"Pending", Conditions:[]v1.PodCondition(nil), Message:"", Reason:"", NominatedNodeName:"", HostIP:"", PodIP:"", PodIPs:[]v1.PodIP(nil), StartTime:(*v1.Time)(nil), InitContainerStatuses:[]v1.ContainerStatus(nil), ContainerStatuses:[]v1.ContainerStatus(nil), QOSClass:"BestEffort", EphemeralContainerStatuses:[]v1.ContainerStatus(nil)}}, got &v1.Pod{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"test5zrxk", GenerateName:"test", Namespace:"default", SelfLink:"/api/v1/namespaces/default/pods/test5zrxk", 
UID:"ccd16e83-dc94-4a95-8822-9dd38c054556", ResourceVersion:"8163", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714614788, loc:(*time.Location)(0x753ec80)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"client.test", Operation:"Update", APIVersion:"v1", Time:(*v1.Time)(0xc03a85e7c0), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc03a85e7a0)}}}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"test", Image:"test-image", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"Always", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc03a905028), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc03915b3e0), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration{v1.Toleration{Key:"node.kubernetes.io/not-ready", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc03a905090)}, v1.Toleration{Key:"node.kubernetes.io/unreachable", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc03a9050c0)}}, HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(0xc03a904fd8), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(0xc03a904fa9), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}, Status:v1.PodStatus{Phase:"Pending", Conditions:[]v1.PodCondition(nil), Message:"", Reason:"", NominatedNodeName:"", HostIP:"", PodIP:"", PodIPs:[]v1.PodIP(nil), StartTime:(*v1.Time)(nil), InitContainerStatuses:[]v1.ContainerStatus(nil), ContainerStatuses:[]v1.ContainerStatus(nil), QOSClass:"BestEffort", EphemeralContainerStatuses:[]v1.ContainerStatus(nil)}}

				from junit_20200114-160406.xml
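Note that the "wanted" and "got" dumps above are field-for-field identical except for pointer addresses (for example Time:(*v1.Time)(0xc03adbc760) versus (*v1.Time)(0xc03a85e7c0)): Go's %#v verb prints nested pointers as bare addresses instead of dereferencing them, so whatever actually differs is hidden behind one of those pointers, most plausibly the unprinted FieldsV1 payloads in ManagedFields. A minimal sketch of that printing/comparison gap, with hypothetical payloads standing in for the hidden difference:

package main

import (
	"fmt"
	"reflect"

	v1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)

func main() {
	// Build two pods whose only difference sits behind a nested pointer.
	mk := func(raw string) *v1.Pod {
		return &v1.Pod{ObjectMeta: metav1.ObjectMeta{
			Name: "test5zrxk",
			ManagedFields: []metav1.ManagedFieldsEntry{{
				Manager:   "client.test",
				Operation: metav1.ManagedFieldsOperationUpdate,
				FieldsV1:  &metav1.FieldsV1{Raw: []byte(raw)},
			}},
		}}
	}
	want := mk(`{"f:spec":{}}`)   // hypothetical payloads; the real
	got := mk(`{"f:status":{}}`)  // difference is not shown in the dump

	// %#v renders the nested *FieldsV1 as a bare address such as
	// (*v1.FieldsV1)(0xc0000b4010), so both entries print alike...
	fmt.Printf("want: %#v\n", want.ManagedFields[0])
	fmt.Printf("got:  %#v\n", got.ManagedFields[0])
	// ...while DeepEqual dereferences the pointer and reports the diff.
	fmt.Println("equal:", reflect.DeepEqual(want, got)) // false
}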


Error lines from build-log.txt

... skipping 56 lines ...
Recording: record_command_canary
Running command: record_command_canary

+++ Running case: test-cmd.record_command_canary 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: record_command_canary
/home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh: line 155: bogus-expected-to-fail: command not found
!!! [0114 15:54:36] Call tree:
!!! [0114 15:54:36]  1: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:47 record_command_canary(...)
!!! [0114 15:54:36]  2: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:112 eVal(...)
!!! [0114 15:54:36]  3: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:131 juLog(...)
!!! [0114 15:54:36]  4: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:159 record_command(...)
!!! [0114 15:54:36]  5: hack/make-rules/test-cmd.sh:35 source(...)
+++ exit code: 1
+++ error: 1
+++ [0114 15:54:36] Running kubeadm tests
+++ [0114 15:54:41] Building go targets for linux/amd64:
    cmd/kubeadm
+++ [0114 15:55:24] Running tests without code coverage
{"Time":"2020-01-14T15:56:45.977060369Z","Action":"output","Package":"k8s.io/kubernetes/cmd/kubeadm/test/cmd","Output":"ok  \tk8s.io/kubernetes/cmd/kubeadm/test/cmd\t44.899s\n"}
✓  cmd/kubeadm/test/cmd (44.9s)
... skipping 302 lines ...
+++ [0114 15:58:30] Building kube-controller-manager
+++ [0114 15:58:34] Building go targets for linux/amd64:
    cmd/kube-controller-manager
+++ [0114 15:59:02] Starting controller-manager
Flag --port has been deprecated, see --secure-port instead.
I0114 15:59:03.504447   54489 serving.go:313] Generated self-signed cert in-memory
W0114 15:59:03.879568   54489 authentication.go:409] failed to read in-cluster kubeconfig for delegated authentication: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W0114 15:59:03.879711   54489 authentication.go:267] No authentication-kubeconfig provided in order to lookup client-ca-file in configmap/extension-apiserver-authentication in kube-system, so client certificate authentication won't work.
W0114 15:59:03.879722   54489 authentication.go:291] No authentication-kubeconfig provided in order to lookup requestheader-client-ca-file in configmap/extension-apiserver-authentication in kube-system, so request-header client certificate authentication won't work.
W0114 15:59:03.879746   54489 authorization.go:177] failed to read in-cluster kubeconfig for delegated authorization: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W0114 15:59:03.879762   54489 authorization.go:146] No authorization-kubeconfig provided, so SubjectAccessReview of authorization tokens won't work.
I0114 15:59:03.879820   54489 controllermanager.go:161] Version: v1.18.0-alpha.1.677+c9003a268dff67
I0114 15:59:03.882011   54489 secure_serving.go:178] Serving securely on [::]:10257
I0114 15:59:03.882119   54489 tlsconfig.go:241] Starting DynamicServingCertificateController
I0114 15:59:03.882543   54489 deprecated_insecure_serving.go:53] Serving insecurely on [::]:10252
I0114 15:59:03.882607   54489 leaderelection.go:242] attempting to acquire leader lease  kube-system/kube-controller-manager...
... skipping 67 lines ...
I0114 15:59:04.156619   54489 controllermanager.go:533] Started "horizontalpodautoscaling"
I0114 15:59:04.156837   54489 horizontal.go:168] Starting HPA controller
I0114 15:59:04.156857   54489 cleaner.go:81] Starting CSR cleaner controller
I0114 15:59:04.156861   54489 shared_informer.go:206] Waiting for caches to sync for HPA
I0114 15:59:04.156842   54489 controllermanager.go:533] Started "csrcleaner"
W0114 15:59:04.157214   54489 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
E0114 15:59:04.157274   54489 core.go:90] Failed to start service controller: WARNING: no cloud provider provided, services of type LoadBalancer will fail
W0114 15:59:04.157282   54489 controllermanager.go:525] Skipping "service"
I0114 15:59:04.157514   54489 node_lifecycle_controller.go:77] Sending events to api server
E0114 15:59:04.157554   54489 core.go:231] failed to start cloud node lifecycle controller: no cloud provider provided
W0114 15:59:04.157564   54489 controllermanager.go:525] Skipping "cloud-node-lifecycle"
W0114 15:59:04.157789   54489 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
I0114 15:59:04.158077   54489 controllermanager.go:533] Started "persistentvolume-expander"
I0114 15:59:04.158204   54489 expand_controller.go:319] Starting expand controller
I0114 15:59:04.158226   54489 shared_informer.go:206] Waiting for caches to sync for expand
I0114 15:59:04.158465   54489 controllermanager.go:533] Started "daemonset"
... skipping 92 lines ...
I0114 15:59:04.931764   54489 controllermanager.go:533] Started "serviceaccount"
I0114 15:59:04.931883   54489 serviceaccounts_controller.go:116] Starting service account controller
I0114 15:59:04.931900   54489 shared_informer.go:206] Waiting for caches to sync for service account
I0114 15:59:04.932417   54489 controllermanager.go:533] Started "persistentvolume-binder"
I0114 15:59:04.933008   54489 pv_controller_base.go:294] Starting persistent volume controller
I0114 15:59:04.933026   54489 shared_informer.go:206] Waiting for caches to sync for persistent volume
W0114 15:59:04.959987   54489 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="127.0.0.1" does not exist
{
  "major": "1",
  "minor": "18+",
  "gitVersion": "v1.18.0-alpha.1.677+c9003a268dff67",
  "gitCommit": "c9003a268dff6700506929b247847341ea4d5b33",
  "gitTreeState": "clean",
... skipping 8 lines ...
I0114 15:59:05.051366   54489 shared_informer.go:213] Caches are synced for certificate-csrapproving 
I0114 15:59:05.051642   54489 shared_informer.go:213] Caches are synced for TTL 
I0114 15:59:05.052751   54489 shared_informer.go:213] Caches are synced for GC 
I0114 15:59:05.055669   54489 shared_informer.go:213] Caches are synced for PV protection 
I0114 15:59:05.057016   54489 shared_informer.go:213] Caches are synced for HPA 
I0114 15:59:05.062416   54489 shared_informer.go:213] Caches are synced for ClusterRoleAggregator 
E0114 15:59:05.070301   54489 clusterroleaggregation_controller.go:180] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again
E0114 15:59:05.071374   54489 clusterroleaggregation_controller.go:180] view failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "view": the object has been modified; please apply your changes to the latest version and try again
E0114 15:59:05.078183   54489 clusterroleaggregation_controller.go:180] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again
+++ [0114 15:59:05] Testing kubectl version: check client only output matches expected output
I0114 15:59:05.162451   54489 shared_informer.go:213] Caches are synced for endpoint 
Successful: the flag '--client' shows correct client info
Successful: the flag '--client' correctly has no server version info
+++ [0114 15:59:05] Testing kubectl version: verify json output
I0114 15:59:05.350008   54489 shared_informer.go:213] Caches are synced for taint 
... skipping 74 lines ...
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_RESTMapper_evaluation_tests
+++ [0114 15:59:08] Creating namespace namespace-1579017548-17342
namespace/namespace-1579017548-17342 created
Context "test" modified.
+++ [0114 15:59:08] Testing RESTMapper
+++ [0114 15:59:09] "kubectl get unknownresourcetype" returns error as expected: error: the server doesn't have a resource type "unknownresourcetype"
+++ exit code: 0
NAME                              SHORTNAMES   APIGROUP                       NAMESPACED   KIND
bindings                                                                      true         Binding
componentstatuses                 cs                                          false        ComponentStatus
configmaps                        cm                                          true         ConfigMap
endpoints                         ep                                          true         Endpoints
... skipping 650 lines ...
has:valid-pod
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          0s
has:valid-pod
core.sh:186: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: resource(s) were provided, but no name, label selector, or --all flag specified
core.sh:190: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:194: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: setting 'all' parameter but found a non empty selector. 
core.sh:198: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
core.sh:206: Successful get pods -l'name in (valid-pod)' {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:211: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-kubectl-describe-pod\" }}found{{end}}{{end}}:: :
... skipping 12 lines ...
poddisruptionbudget.policy/test-pdb-2 created
core.sh:245: Successful get pdb/test-pdb-2 --namespace=test-kubectl-describe-pod {{.spec.minAvailable}}: 50%
poddisruptionbudget.policy/test-pdb-3 created
core.sh:251: Successful get pdb/test-pdb-3 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 2
poddisruptionbudget.policy/test-pdb-4 created
core.sh:255: Successful get pdb/test-pdb-4 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 50%
error: min-available and max-unavailable cannot be both specified
core.sh:261: Successful get pods --namespace=test-kubectl-describe-pod {{range.items}}{{.metadata.name}}:{{end}}: 
pod/env-test-pod created
matched TEST_CMD_1
matched <set to the key 'key-1' in secret 'test-secret'>
matched TEST_CMD_2
matched <set to the key 'key-2' of config map 'test-configmap'>
... skipping 188 lines ...
pod/valid-pod patched
core.sh:470: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: changed-with-yaml:
pod/valid-pod patched
core.sh:475: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:3.1:
pod/valid-pod patched
core.sh:491: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
+++ [0114 15:59:45] "kubectl patch with resourceVersion 521" returns error as expected: Error from server (Conflict): Operation cannot be fulfilled on pods "valid-pod": the object has been modified; please apply your changes to the latest version and try again
pod "valid-pod" deleted
pod/valid-pod replaced
core.sh:515: Successful get pod valid-pod {{(index .spec.containers 0).name}}: replaced-k8s-serve-hostname
Successful
message:error: --grace-period must have --force specified
has:\-\-grace-period must have \-\-force specified
Successful
message:error: --timeout must have --force specified
has:\-\-timeout must have \-\-force specified
node/node-v1-test created
W0114 15:59:45.896655   54489 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="node-v1-test" does not exist
node/node-v1-test replaced
core.sh:552: Successful get node node-v1-test {{.metadata.annotations.a}}: b
node "node-v1-test" deleted
core.sh:559: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
core.sh:562: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/serve_hostname:
Edit cancelled, no changes made.
... skipping 22 lines ...
spec:
  containers:
  - image: k8s.gcr.io/pause:2.0
    name: kubernetes-pause
has:localonlyvalue
core.sh:585: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
error: 'name' already has a value (valid-pod), and --overwrite is false
core.sh:589: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
core.sh:593: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
pod/valid-pod labeled
core.sh:597: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod-super-sayan
core.sh:601: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
... skipping 85 lines ...
+++ Running case: test-cmd.run_kubectl_create_error_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_create_error_tests
+++ [0114 15:59:55] Creating namespace namespace-1579017595-23687
namespace/namespace-1579017595-23687 created
Context "test" modified.
+++ [0114 15:59:55] Testing kubectl create with error
Error: must specify one of -f and -k

Create a resource from a file or from stdin.

 JSON and YAML formats are accepted.

Examples:
... skipping 41 lines ...

Usage:
  kubectl create -f FILENAME [options]

Use "kubectl <command> --help" for more information about a given command.
Use "kubectl options" for a list of global command-line options (applies to all commands).
+++ [0114 15:59:55] "kubectl create with empty string list returns error as expected: error: error validating "hack/testdata/invalid-rc-with-empty-args.yaml": error validating data: ValidationError(ReplicationController.spec.template.spec.containers[0].args): unknown object type "nil" in ReplicationController.spec.template.spec.containers[0].args[0]; if you choose to ignore these errors, turn validation off with --validate=false
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
+++ exit code: 0
Recording: run_kubectl_apply_tests
Running command: run_kubectl_apply_tests

... skipping 17 lines ...
(Bpod "test-pod" deleted
customresourcedefinition.apiextensions.k8s.io/resources.mygroup.example.com created
I0114 15:59:58.467892   51025 client.go:361] parsed scheme: "endpoint"
I0114 15:59:58.469885   51025 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I0114 15:59:58.474030   51025 controller.go:606] quota admission added evaluator for: resources.mygroup.example.com
kind.mygroup.example.com/myobj serverside-applied (server dry run)
Error from server (NotFound): resources.mygroup.example.com "myobj" not found
customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
+++ exit code: 0
Recording: run_kubectl_run_tests
Running command: run_kubectl_run_tests

+++ Running case: test-cmd.run_kubectl_run_tests 
... skipping 102 lines ...
Context "test" modified.
+++ [0114 16:00:01] Testing kubectl create filter
create.sh:30: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/selector-test-pod created
create.sh:34: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
Successful
message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
has:pods "selector-test-pod-dont-apply" not found
pod "selector-test-pod" deleted
+++ exit code: 0
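The filter test above creates only the manifests whose labels match the -l selector, which is why the follow-up get for the non-matching pod is expected to 404. A sketch with a hypothetical directory layout:

  # only objects labeled name=selector-test-pod are created from the input
  kubectl create -f hack/testdata/selector-pods/ -l name=selector-test-pod
  kubectl get pods selector-test-pod-dont-apply   # Error from server (NotFound)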
Recording: run_kubectl_apply_deployments_tests
Running command: run_kubectl_apply_deployments_tests

... skipping 29 lines ...
I0114 16:00:03.780560   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017601-28231", Name:"nginx", UID:"38d456cb-b2be-4d01-9771-5d4086ea99e5", APIVersion:"apps/v1", ResourceVersion:"609", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-8484dd655 to 3
I0114 16:00:03.783021   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017601-28231", Name:"nginx-8484dd655", UID:"33d9532f-60aa-43a8-afe4-e3e2733b98ee", APIVersion:"apps/v1", ResourceVersion:"610", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-kpcnd
I0114 16:00:03.786451   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017601-28231", Name:"nginx-8484dd655", UID:"33d9532f-60aa-43a8-afe4-e3e2733b98ee", APIVersion:"apps/v1", ResourceVersion:"610", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-szcs6
I0114 16:00:03.786859   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017601-28231", Name:"nginx-8484dd655", UID:"33d9532f-60aa-43a8-afe4-e3e2733b98ee", APIVersion:"apps/v1", ResourceVersion:"610", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-6jp89
apps.sh:148: Successful get deployment nginx {{.metadata.name}}: nginx
Successful
message:Error from server (Conflict): error when applying patch:
{"metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1579017601-28231\",\"resourceVersion\":\"99\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx2\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx2\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"},"resourceVersion":"99"},"spec":{"selector":{"matchLabels":{"name":"nginx2"}},"template":{"metadata":{"labels":{"name":"nginx2"}}}}}
to:
Resource: "apps/v1, Resource=deployments", GroupVersionKind: "apps/v1, Kind=Deployment"
Name: "nginx", Namespace: "namespace-1579017601-28231"
for: "hack/testdata/deployment-label-change2.yaml": Operation cannot be fulfilled on deployments.apps "nginx": the object has been modified; please apply your changes to the latest version and try again
has:Error from server (Conflict)
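The Conflict above is optimistic concurrency doing its job: the manifest pins resourceVersion "99", so the server refuses the patch once the live Deployment has moved past that version. A sketch of the failure mode and the usual fix (manifest name taken from the log):

  # a manifest that pins metadata.resourceVersion invites this Conflict:
  kubectl apply -f hack/testdata/deployment-label-change2.yaml
  # Error from server (Conflict): ... please apply your changes to the latest version and try again
  # fix: drop resourceVersion from the manifest and re-apply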
I0114 16:00:09.796963   54489 horizontal.go:353] Horizontal Pod Autoscaler frontend has been deleted in namespace-1579017592-7381
deployment.apps/nginx configured
I0114 16:00:13.331579   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017601-28231", Name:"nginx", UID:"8e680b21-391a-4382-8a3a-893c3c882d68", APIVersion:"apps/v1", ResourceVersion:"655", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-668b6c7744 to 3
I0114 16:00:13.335514   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017601-28231", Name:"nginx-668b6c7744", UID:"987ddb58-8289-4d00-a3ef-2148cb884883", APIVersion:"apps/v1", ResourceVersion:"656", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-mrl2x
I0114 16:00:13.339348   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017601-28231", Name:"nginx-668b6c7744", UID:"987ddb58-8289-4d00-a3ef-2148cb884883", APIVersion:"apps/v1", ResourceVersion:"656", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-c2c5x
I0114 16:00:13.339726   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017601-28231", Name:"nginx-668b6c7744", UID:"987ddb58-8289-4d00-a3ef-2148cb884883", APIVersion:"apps/v1", ResourceVersion:"656", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-h2562
Successful
message:        "name": "nginx2"
          "name": "nginx2"
has:"name": "nginx2"
E0114 16:00:17.654219   54489 replica_set.go:534] sync "namespace-1579017601-28231/nginx-668b6c7744" failed with Operation cannot be fulfilled on replicasets.apps "nginx-668b6c7744": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1579017601-28231/nginx-668b6c7744, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 987ddb58-8289-4d00-a3ef-2148cb884883, UID in object meta: 
I0114 16:00:18.642073   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017601-28231", Name:"nginx", UID:"ec6e0ad8-ed2f-4c65-8c55-56d91685d3cd", APIVersion:"apps/v1", ResourceVersion:"690", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-668b6c7744 to 3
I0114 16:00:18.645050   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017601-28231", Name:"nginx-668b6c7744", UID:"8cc862d3-87b8-49e4-8cd6-2d561673d97f", APIVersion:"apps/v1", ResourceVersion:"691", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-zb84l
I0114 16:00:18.649682   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017601-28231", Name:"nginx-668b6c7744", UID:"8cc862d3-87b8-49e4-8cd6-2d561673d97f", APIVersion:"apps/v1", ResourceVersion:"691", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-9js5j
I0114 16:00:18.649716   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017601-28231", Name:"nginx-668b6c7744", UID:"8cc862d3-87b8-49e4-8cd6-2d561673d97f", APIVersion:"apps/v1", ResourceVersion:"691", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-bz9vp
Successful
message:The Deployment "nginx" is invalid: spec.template.metadata.labels: Invalid value: map[string]string{"name":"nginx3"}: `selector` does not match template `labels`
... skipping 132 lines ...
+++ [0114 16:00:20] Creating namespace namespace-1579017620-20556
namespace/namespace-1579017620-20556 created
Context "test" modified.
+++ [0114 16:00:20] Testing kubectl get
get.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:37: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:45: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:{
    "apiVersion": "v1",
    "items": [],
... skipping 23 lines ...
has not:No resources found
Successful
message:NAME
has not:No resources found
get.sh:73: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:error: the server doesn't have a resource type "foobar"
has not:No resources found
Successful
message:No resources found in namespace-1579017620-20556 namespace.
has:No resources found
Successful
message:
has not:No resources found
Successful
message:No resources found in namespace-1579017620-20556 namespace.
has:No resources found
get.sh:93: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
Successful
message:Error from server (NotFound): pods "abc" not found
has not:List
Successful
message:I0114 16:00:22.474572   64987 loader.go:375] Config loaded from file:  /tmp/tmp.75KxkBT0hy/.kube/config
I0114 16:00:22.476117   64987 round_trippers.go:443] GET http://127.0.0.1:8080/version?timeout=32s 200 OK in 1 milliseconds
I0114 16:00:22.508910   64987 round_trippers.go:443] GET http://127.0.0.1:8080/api/v1/namespaces/default/pods 200 OK in 5 milliseconds
I0114 16:00:22.510975   64987 round_trippers.go:443] GET http://127.0.0.1:8080/api/v1/namespaces/default/replicationcontrollers 200 OK in 1 milliseconds
... skipping 479 lines ...
Successful
message:NAME    DATA   AGE
one     0      1s
three   0      1s
two     0      1s
STATUS    REASON          MESSAGE
Failure   InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has not:watch is only supported on individual resources
Successful
message:STATUS    REASON          MESSAGE
Failure   InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has not:watch is only supported on individual resources
+++ [0114 16:00:29] Creating namespace namespace-1579017629-14916
namespace/namespace-1579017629-14916 created
Context "test" modified.
get.sh:153: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/valid-pod created
... skipping 105 lines ...
}
get.sh:158: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
<no value>Successful
message:valid-pod:
has:valid-pod:
Successful
message:error: error executing jsonpath "{.missing}": Error executing template: missing is not found. Printing more information for debugging the template:
	template was:
		{.missing}
	object given to jsonpath engine was:
		map[string]interface {}{"apiVersion":"v1", "kind":"Pod", "metadata":map[string]interface {}{"creationTimestamp":"2020-01-14T16:00:29Z", "labels":map[string]interface {}{"name":"valid-pod"}, "managedFields":[]interface {}{map[string]interface {}{"apiVersion":"v1", "fieldsType":"FieldsV1", "fieldsV1":map[string]interface {}{"f:metadata":map[string]interface {}{"f:labels":map[string]interface {}{".":map[string]interface {}{}, "f:name":map[string]interface {}{}}}, "f:spec":map[string]interface {}{"f:containers":map[string]interface {}{"k:{\"name\":\"kubernetes-serve-hostname\"}":map[string]interface {}{".":map[string]interface {}{}, "f:image":map[string]interface {}{}, "f:imagePullPolicy":map[string]interface {}{}, "f:name":map[string]interface {}{}, "f:resources":map[string]interface {}{".":map[string]interface {}{}, "f:limits":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}, "f:requests":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}}, "f:terminationMessagePath":map[string]interface {}{}, "f:terminationMessagePolicy":map[string]interface {}{}}}, "f:dnsPolicy":map[string]interface {}{}, "f:enableServiceLinks":map[string]interface {}{}, "f:priority":map[string]interface {}{}, "f:restartPolicy":map[string]interface {}{}, "f:schedulerName":map[string]interface {}{}, "f:securityContext":map[string]interface {}{}, "f:terminationGracePeriodSeconds":map[string]interface {}{}}}, "manager":"kubectl", "operation":"Update", "time":"2020-01-14T16:00:29Z"}}, "name":"valid-pod", "namespace":"namespace-1579017629-14916", "resourceVersion":"737", "selfLink":"/api/v1/namespaces/namespace-1579017629-14916/pods/valid-pod", "uid":"45565120-8cc3-466a-b207-709cefac0d54"}, "spec":map[string]interface {}{"containers":[]interface {}{map[string]interface {}{"image":"k8s.gcr.io/serve_hostname", "imagePullPolicy":"Always", "name":"kubernetes-serve-hostname", "resources":map[string]interface {}{"limits":map[string]interface {}{"cpu":"1", "memory":"512Mi"}, "requests":map[string]interface {}{"cpu":"1", "memory":"512Mi"}}, "terminationMessagePath":"/dev/termination-log", "terminationMessagePolicy":"File"}}, "dnsPolicy":"ClusterFirst", "enableServiceLinks":true, "priority":0, "restartPolicy":"Always", "schedulerName":"default-scheduler", "securityContext":map[string]interface {}{}, "terminationGracePeriodSeconds":30}, "status":map[string]interface {}{"phase":"Pending", "qosClass":"Guaranteed"}}
has:missing is not found
error: error executing template "{{.missing}}": template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing"
Successful
message:Error executing template: template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing". Printing more information for debugging the template:
	template was:
		{{.missing}}
	raw data was:
		{"apiVersion":"v1","kind":"Pod","metadata":{"creationTimestamp":"2020-01-14T16:00:29Z","labels":{"name":"valid-pod"},"managedFields":[{"apiVersion":"v1","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:labels":{".":{},"f:name":{}}},"f:spec":{"f:containers":{"k:{\"name\":\"kubernetes-serve-hostname\"}":{".":{},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:resources":{".":{},"f:limits":{".":{},"f:cpu":{},"f:memory":{}},"f:requests":{".":{},"f:cpu":{},"f:memory":{}}},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:enableServiceLinks":{},"f:priority":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}},"manager":"kubectl","operation":"Update","time":"2020-01-14T16:00:29Z"}],"name":"valid-pod","namespace":"namespace-1579017629-14916","resourceVersion":"737","selfLink":"/api/v1/namespaces/namespace-1579017629-14916/pods/valid-pod","uid":"45565120-8cc3-466a-b207-709cefac0d54"},"spec":{"containers":[{"image":"k8s.gcr.io/serve_hostname","imagePullPolicy":"Always","name":"kubernetes-serve-hostname","resources":{"limits":{"cpu":"1","memory":"512Mi"},"requests":{"cpu":"1","memory":"512Mi"}},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File"}],"dnsPolicy":"ClusterFirst","enableServiceLinks":true,"priority":0,"restartPolicy":"Always","schedulerName":"default-scheduler","securityContext":{},"terminationGracePeriodSeconds":30},"status":{"phase":"Pending","qosClass":"Guaranteed"}}
	object given to template engine was:
		map[apiVersion:v1 kind:Pod metadata:map[creationTimestamp:2020-01-14T16:00:29Z labels:map[name:valid-pod] managedFields:[map[apiVersion:v1 fieldsType:FieldsV1 fieldsV1:map[f:metadata:map[f:labels:map[.:map[] f:name:map[]]] f:spec:map[f:containers:map[k:{"name":"kubernetes-serve-hostname"}:map[.:map[] f:image:map[] f:imagePullPolicy:map[] f:name:map[] f:resources:map[.:map[] f:limits:map[.:map[] f:cpu:map[] f:memory:map[]] f:requests:map[.:map[] f:cpu:map[] f:memory:map[]]] f:terminationMessagePath:map[] f:terminationMessagePolicy:map[]]] f:dnsPolicy:map[] f:enableServiceLinks:map[] f:priority:map[] f:restartPolicy:map[] f:schedulerName:map[] f:securityContext:map[] f:terminationGracePeriodSeconds:map[]]] manager:kubectl operation:Update time:2020-01-14T16:00:29Z]] name:valid-pod namespace:namespace-1579017629-14916 resourceVersion:737 selfLink:/api/v1/namespaces/namespace-1579017629-14916/pods/valid-pod uid:45565120-8cc3-466a-b207-709cefac0d54] spec:map[containers:[map[image:k8s.gcr.io/serve_hostname imagePullPolicy:Always name:kubernetes-serve-hostname resources:map[limits:map[cpu:1 memory:512Mi] requests:map[cpu:1 memory:512Mi]] terminationMessagePath:/dev/termination-log terminationMessagePolicy:File]] dnsPolicy:ClusterFirst enableServiceLinks:true priority:0 restartPolicy:Always schedulerName:default-scheduler securityContext:map[] terminationGracePeriodSeconds:30] status:map[phase:Pending qosClass:Guaranteed]]
has:map has no entry for key "missing"
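Both output engines above fail loudly on a key that is absent from the object, which is exactly what the suite asserts. A sketch against the same pod:

  kubectl get pod valid-pod -o jsonpath='{.metadata.name}'       # prints valid-pod
  kubectl get pod valid-pod -o jsonpath='{.missing}'             # error: missing is not found
  kubectl get pod valid-pod -o go-template='{{.metadata.name}}'  # prints valid-pod
  kubectl get pod valid-pod -o go-template='{{.missing}}'        # map has no entry for key "missing"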
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
STATUS      REASON          MESSAGE
Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has:STATUS
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
STATUS      REASON          MESSAGE
Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has:valid-pod
Successful
message:pod/valid-pod
status/<unknown>
has not:STATUS
Successful
... skipping 82 lines ...
      (Client.Timeout exceeded while reading body)'
    reason: UnexpectedServerResponse
  - message: 'unable to decode an event from the watch stream: net/http: request canceled
      (Client.Timeout exceeded while reading body)'
    reason: ClientWatchDecoding
kind: Status
message: 'an error on the server ("unable to decode an event from the watch stream:
  net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented
  the request from succeeding'
metadata: {}
reason: InternalError
status: Failure
has not:STATUS
... skipping 79 lines ...
      (Client.Timeout exceeded while reading body)'
    reason: UnexpectedServerResponse
  - message: 'unable to decode an event from the watch stream: net/http: request canceled
      (Client.Timeout exceeded while reading body)'
    reason: ClientWatchDecoding
kind: Status
message: 'an error on the server ("unable to decode an event from the watch stream:
  net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented
  the request from succeeding'
metadata: {}
reason: InternalError
status: Failure
has:name: valid-pod
Successful
message:Error from server (NotFound): pods "invalid-pod" not found
has:"invalid-pod" not found
pod "valid-pod" deleted
get.sh:196: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/redis-master created
pod/valid-pod created
Successful
... skipping 35 lines ...
+++ command: run_kubectl_exec_pod_tests
+++ [0114 16:00:35] Creating namespace namespace-1579017635-23695
namespace/namespace-1579017635-23695 created
Context "test" modified.
+++ [0114 16:00:35] Testing kubectl exec POD COMMAND
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
pod/test-pod created
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pods "test-pod" not found
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod or type/name must be specified
pod "test-pod" deleted
+++ exit code: 0
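The BadRequest responses above distinguish "pod exists but is unscheduled" from "pod does not exist": exec reached the pod object, but there is no node to dial in this apiserver-only test environment. A sketch:

  kubectl exec abc -- date       # NotFound: no such pod
  kubectl exec test-pod -- date  # BadRequest: pod test-pod does not have a host assigned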
Recording: run_kubectl_exec_resource_name_tests
Running command: run_kubectl_exec_resource_name_tests

... skipping 2 lines ...
+++ command: run_kubectl_exec_resource_name_tests
+++ [0114 16:00:35] Creating namespace namespace-1579017635-9667
namespace/namespace-1579017635-9667 created
Context "test" modified.
+++ [0114 16:00:36] Testing kubectl exec TYPE/NAME COMMAND
Successful
message:error: the server doesn't have a resource type "foo"
has:error:
Successful
message:Error from server (NotFound): deployments.apps "bar" not found
has:"bar" not found
pod/test-pod created
replicaset.apps/frontend created
I0114 16:00:36.733139   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017635-9667", Name:"frontend", UID:"8620a6b7-eac7-490f-8337-3f25702fadb3", APIVersion:"apps/v1", ResourceVersion:"795", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-k8h55
I0114 16:00:36.735475   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017635-9667", Name:"frontend", UID:"8620a6b7-eac7-490f-8337-3f25702fadb3", APIVersion:"apps/v1", ResourceVersion:"795", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-pdl5s
I0114 16:00:36.736895   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017635-9667", Name:"frontend", UID:"8620a6b7-eac7-490f-8337-3f25702fadb3", APIVersion:"apps/v1", ResourceVersion:"795", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-kxrdv
configmap/test-set-env-config created
Successful
message:error: cannot attach to *v1.ConfigMap: selector for *v1.ConfigMap not implemented
has:not implemented
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:not found
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod or type/name must be specified
Successful
message:Error from server (BadRequest): pod frontend-k8h55 does not have a host assigned
has not:not found
Successful
message:Error from server (BadRequest): pod frontend-k8h55 does not have a host assigned
has not:pod or type/name must be specified
pod "test-pod" deleted
replicaset.apps "frontend" deleted
configmap "test-set-env-config" deleted
+++ exit code: 0
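kubectl exec also accepts TYPE/NAME and resolves a pod through the workload's selector, which is why the replicaset case lands on frontend-k8h55; that resolution is not implemented for selector-less kinds such as ConfigMap. A sketch:

  kubectl exec replicaset/frontend -- date   # resolves to one of the frontend pods
  kubectl exec deployment/bar -- date        # NotFound: deployments.apps "bar" not found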
Recording: run_create_secret_tests
Running command: run_create_secret_tests

+++ Running case: test-cmd.run_create_secret_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_create_secret_tests
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
message:user-specified
has:user-specified
Successful
{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-update-cm","uid":"84d63c54-ebf8-4106-b9d3-983c901db53f","resourceVersion":"815","creationTimestamp":"2020-01-14T16:00:38Z"}}
... skipping 2 lines ...
has:uid
Successful
message:{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-update-cm","uid":"84d63c54-ebf8-4106-b9d3-983c901db53f","resourceVersion":"816","creationTimestamp":"2020-01-14T16:00:38Z"},"data":{"key1":"config1"}}
has:config1
{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Success","details":{"name":"tester-update-cm","kind":"configmaps","uid":"84d63c54-ebf8-4106-b9d3-983c901db53f"}}
Successful
message:Error from server (NotFound): configmaps "tester-update-cm" not found
has:configmaps "tester-update-cm" not found
+++ exit code: 0
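The tester-update-cm exchange above appears to drive the ConfigMap REST endpoints directly (note the raw JSON responses), verifying that the uid is preserved across an update and that the object 404s after deletion. The secret checks at the top use ordinary porcelain; a sketch:

  kubectl get secret mysecret                                # NotFound until created
  kubectl create secret generic mysecret --from-literal=username=user-specified
  kubectl get secret mysecret -o jsonpath='{.data.username}' | base64 -d   # user-specified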
Recording: run_kubectl_create_kustomization_directory_tests
Running command: run_kubectl_create_kustomization_directory_tests

+++ Running case: test-cmd.run_kubectl_create_kustomization_directory_tests 
... skipping 159 lines ...
valid-pod   0/1     Pending   0          1s
has:valid-pod
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
STATUS      REASON          MESSAGE
Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has:Timeout exceeded while reading body
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          2s
has:valid-pod
Successful
message:error: Invalid timeout value. Timeout must be a single integer in seconds, or an integer followed by a corresponding time unit (e.g. 1s | 2m | 3h)
has:Invalid timeout value
pod "valid-pod" deleted
+++ exit code: 0
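The "Invalid timeout value" error is client-side validation of kubectl's timeout syntax (in this suite, most likely the global --request-timeout flag): a bare integer means seconds, anything else needs a time unit. A sketch:

  kubectl get pod valid-pod --request-timeout=1      # ok: bare integer, seconds
  kubectl get pod valid-pod --request-timeout=1s     # ok: explicit unit
  kubectl get pod valid-pod --request-timeout=bogus  # rejected: Invalid timeout value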
Recording: run_crd_tests
Running command: run_crd_tests

... skipping 158 lines ...
foo.company.com/test patched
crd.sh:236: Successful get foos/test {{.patched}}: value1
foo.company.com/test patched
crd.sh:238: Successful get foos/test {{.patched}}: value2
foo.company.com/test patched
crd.sh:240: Successful get foos/test {{.patched}}: <no value>
+++ [0114 16:00:48] "kubectl patch --local" returns error as expected for CustomResource: error: cannot apply strategic merge patch for company.com/v1, Kind=Foo locally, try --type merge
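The --local failure is specific to the patch strategy: strategic merge needs a schema for the type, which is unavailable for a CustomResource evaluated client-side, so the command itself suggests --type merge. A sketch (file name hypothetical):

  # fails locally: cannot apply strategic merge patch for company.com/v1, Kind=Foo
  kubectl patch --local -f foo-test.yaml -p '{"patched":null}'
  # works: a JSON merge patch needs no schema
  kubectl patch --local -f foo-test.yaml -p '{"patched":null}' --type=merge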
{
    "apiVersion": "company.com/v1",
    "kind": "Foo",
    "metadata": {
        "annotations": {
            "kubernetes.io/change-cause": "kubectl patch foos/test --server=http://127.0.0.1:8080 --match-server-version=true --patch={\"patched\":null} --type=merge --record=true"
... skipping 193 lines ...
crd.sh:450: Successful get bars {{range.items}}{{.metadata.name}}:{{end}}: 
namespace/non-native-resources created
bar.company.com/test created
crd.sh:455: Successful get bars {{len .items}}: 1
namespace "non-native-resources" deleted
crd.sh:458: Successful get bars {{len .items}}: 0
Error from server (NotFound): namespaces "non-native-resources" not found
customresourcedefinition.apiextensions.k8s.io "foos.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "bars.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
customresourcedefinition.apiextensions.k8s.io "validfoos.company.com" deleted
+++ exit code: 0
Recording: run_cmd_with_img_tests
... skipping 11 lines ...
I0114 16:01:20.141424   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017679-4671", Name:"test1-6cdffdb5b8", UID:"a9c622d4-499a-4000-8beb-e1f65bee02e8", APIVersion:"apps/v1", ResourceVersion:"995", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test1-6cdffdb5b8-bk4w6
Successful
message:deployment.apps/test1 created
has:deployment.apps/test1 created
deployment.apps "test1" deleted
Successful
message:error: Invalid image name "InvalidImageName": invalid reference format
has:error: Invalid image name "InvalidImageName": invalid reference format
+++ exit code: 0
+++ [0114 16:01:20] Testing recursive resources
+++ [0114 16:01:20] Creating namespace namespace-1579017680-27368
namespace/namespace-1579017680-27368 created
W0114 16:01:20.497708   51025 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E0114 16:01:20.498950   54489 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
W0114 16:01:20.590131   51025 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E0114 16:01:20.591211   54489 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
W0114 16:01:20.702092   51025 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E0114 16:01:20.703361   54489 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
W0114 16:01:20.805339   51025 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E0114 16:01:20.806470   54489 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:pod/busybox0 created
pod/busybox1 created
error: error validating "hack/testdata/recursive/pod/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
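Everything in this block runs kubectl with -R/--recursive over a directory tree that deliberately contains one broken manifest; the valid busybox0/busybox1 objects are still processed and the broken file surfaces as a per-file error. A sketch:

  # creates busybox0 and busybox1, reports the kind-less manifest as an error
  kubectl create -f hack/testdata/recursive/pod --recursive
  # or silence validation (not recommended):
  kubectl create -f hack/testdata/recursive/pod --recursive --validate=false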
generic-resources.sh:211: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:220: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: busybox:busybox:
Successful
message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:227: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0114 16:01:21.500210   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:231: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
Successful
message:pod/busybox0 replaced
pod/busybox1 replaced
error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
E0114 16:01:21.592361   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:236: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0114 16:01:21.704382   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Name:         busybox0
Namespace:    namespace-1579017680-27368
Priority:     0
Node:         <none>
Labels:       app=busybox0
... skipping 153 lines ...
QoS Class:        BestEffort
Node-Selectors:   <none>
Tolerations:      <none>
Events:           <none>
unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E0114 16:01:21.807661   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:246: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
(Bgeneric-resources.sh:250: Successful get pods {{range.items}}{{.metadata.annotations.annotatekey}}:{{end}}: annotatevalue:annotatevalue:
(BSuccessful
message:pod/busybox0 annotated
pod/busybox1 annotated
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:255: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:259: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
Successful
message:error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
generic-resources.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:01:22.501562   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx created
I0114 16:01:22.574414   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017680-27368", Name:"nginx", UID:"9e6ac8ae-ff22-4cef-bbb5-d24f00c8fa03", APIVersion:"apps/v1", ResourceVersion:"1018", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-f87d999f7 to 3
I0114 16:01:22.577972   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017680-27368", Name:"nginx-f87d999f7", UID:"0d96ecd1-3b64-4958-93fb-b55eb824382a", APIVersion:"apps/v1", ResourceVersion:"1019", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-5nk6j
I0114 16:01:22.580324   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017680-27368", Name:"nginx-f87d999f7", UID:"0d96ecd1-3b64-4958-93fb-b55eb824382a", APIVersion:"apps/v1", ResourceVersion:"1019", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-fhrbm
I0114 16:01:22.582133   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017680-27368", Name:"nginx-f87d999f7", UID:"0d96ecd1-3b64-4958-93fb-b55eb824382a", APIVersion:"apps/v1", ResourceVersion:"1019", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-bkrrl
E0114 16:01:22.593407   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:269: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx:
E0114 16:01:22.705580   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:270: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0114 16:01:22.808732   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
generic-resources.sh:274: Successful get deployment nginx {{ .apiVersion }}: apps/v1
Successful
message:apiVersion: extensions/v1beta1
kind: Deployment
... skipping 40 lines ...
deployment.apps "nginx" deleted
generic-resources.sh:281: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
(Bgeneric-resources.sh:285: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:290: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:busybox0:busybox1:
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E0114 16:01:23.502883   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:299: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0114 16:01:23.594629   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
E0114 16:01:23.706640   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:304: Successful get pods {{range.items}}{{.metadata.labels.mylabel}}:{{end}}: myvalue:myvalue:
Successful
message:pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:309: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0114 16:01:23.810107   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
generic-resources.sh:314: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: prom/busybox:prom/busybox:
Successful
message:pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:319: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0114 16:01:24.233099   54489 namespace_controller.go:185] Namespace has been deleted non-native-resources
generic-resources.sh:323: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "busybox0" force deleted
pod "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:328: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/busybox0 created
replicationcontroller/busybox1 created
error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0114 16:01:24.502559   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017680-27368", Name:"busybox0", UID:"b1276733-ca1b-4678-988c-0da7d8e79ba8", APIVersion:"v1", ResourceVersion:"1050", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-wkk8d
I0114 16:01:24.503082   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017680-27368", Name:"busybox1", UID:"f706ac4b-f4ee-4b25-9b16-5c4798aad3e5", APIVersion:"v1", ResourceVersion:"1051", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-f6h4v
E0114 16:01:24.504364   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:01:24.595853   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:332: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:337: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0114 16:01:24.707739   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:338: Successful get rc busybox0 {{.spec.replicas}}: 1
E0114 16:01:24.811112   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:339: Successful get rc busybox1 {{.spec.replicas}}: 1
generic-resources.sh:344: Successful get hpa busybox0 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
(Bgeneric-resources.sh:345: Successful get hpa busybox1 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
Successful
message:horizontalpodautoscaler.autoscaling/busybox0 autoscaled
horizontalpodautoscaler.autoscaling/busybox1 autoscaled
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
horizontalpodautoscaler.autoscaling "busybox0" deleted
horizontalpodautoscaler.autoscaling "busybox1" deleted
generic-resources.sh:353: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:354: Successful get rc busybox0 {{.spec.replicas}}: 1
E0114 16:01:25.505545   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:355: Successful get rc busybox1 {{.spec.replicas}}: 1
E0114 16:01:25.597046   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:01:25.708729   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:359: Successful get service busybox0 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
generic-resources.sh:360: Successful get service busybox1 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
Successful
message:service/busybox0 exposed
service/busybox1 exposed
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
E0114 16:01:25.812336   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:366: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:367: Successful get rc busybox0 {{.spec.replicas}}: 1
generic-resources.sh:368: Successful get rc busybox1 {{.spec.replicas}}: 1
I0114 16:01:26.159954   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017680-27368", Name:"busybox0", UID:"b1276733-ca1b-4678-988c-0da7d8e79ba8", APIVersion:"v1", ResourceVersion:"1072", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-p8gpp
I0114 16:01:26.171474   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017680-27368", Name:"busybox1", UID:"f706ac4b-f4ee-4b25-9b16-5c4798aad3e5", APIVersion:"v1", ResourceVersion:"1077", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-47nnm
generic-resources.sh:372: Successful get rc busybox0 {{.spec.replicas}}: 2
generic-resources.sh:373: Successful get rc busybox1 {{.spec.replicas}}: 2
Successful
message:replicationcontroller/busybox0 scaled
replicationcontroller/busybox1 scaled
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
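Same pattern for scaling: both replication controllers go from 1 to 2 replicas, and the undecodable manifest is reported without aborting the rest of the set. A sketch:

  kubectl scale -f hack/testdata/recursive/rc --recursive --replicas=2
  kubectl get rc busybox0 -o jsonpath='{.spec.replicas}'   # 2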
generic-resources.sh:378: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0114 16:01:26.506858   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:01:26.598265   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:382: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
replicationcontroller "busybox0" force deleted
replicationcontroller "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
E0114 16:01:26.709931   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:387: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:01:26.813442   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx1-deployment created
deployment.apps/nginx0-deployment created
error: error validating "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0114 16:01:26.930483   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017680-27368", Name:"nginx1-deployment", UID:"8554545e-ed3b-4154-8ca6-b8b0910e1ba7", APIVersion:"apps/v1", ResourceVersion:"1094", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx1-deployment-7bdbbfb5cf to 2
I0114 16:01:26.935245   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017680-27368", Name:"nginx1-deployment-7bdbbfb5cf", UID:"b23db44f-2260-49d7-a116-0798221d54d2", APIVersion:"apps/v1", ResourceVersion:"1096", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-7bdbbfb5cf-w7mxv
I0114 16:01:26.937166   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017680-27368", Name:"nginx0-deployment", UID:"4ddec59d-ac87-4727-a106-71b8ec776b93", APIVersion:"apps/v1", ResourceVersion:"1095", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx0-deployment-57c6bff7f6 to 2
I0114 16:01:26.937563   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017680-27368", Name:"nginx1-deployment-7bdbbfb5cf", UID:"b23db44f-2260-49d7-a116-0798221d54d2", APIVersion:"apps/v1", ResourceVersion:"1096", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-7bdbbfb5cf-5vwpq
I0114 16:01:26.942374   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017680-27368", Name:"nginx0-deployment-57c6bff7f6", UID:"31abc578-602a-4767-9000-ac96bace6119", APIVersion:"apps/v1", ResourceVersion:"1097", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-57c6bff7f6-c8pw7
I0114 16:01:26.946000   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017680-27368", Name:"nginx0-deployment-57c6bff7f6", UID:"31abc578-602a-4767-9000-ac96bace6119", APIVersion:"apps/v1", ResourceVersion:"1097", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-57c6bff7f6-j8hcs
generic-resources.sh:391: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx0-deployment:nginx1-deployment:
generic-resources.sh:392: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
(Bgeneric-resources.sh:396: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
Successful
message:deployment.apps/nginx1-deployment skipped rollback (current template already matches revision 1)
deployment.apps/nginx0-deployment skipped rollback (current template already matches revision 1)
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
deployment.apps/nginx1-deployment paused
deployment.apps/nginx0-deployment paused
E0114 16:01:27.508101   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:404: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: true:true:
Successful
message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
E0114 16:01:27.600867   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx1-deployment resumed
deployment.apps/nginx0-deployment resumed
E0114 16:01:27.711048   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:410: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: <no value>:<no value>:
Successful
message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
E0114 16:01:27.814704   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx0-deployment
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx1-deployment
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
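pause, resume, and history all fan out across the recursive set: spec.paused flips to true and then clears (shown as <no value> once unset), and history prints one revision table per deployment before the shared decode error for the broken file. A sketch:

  kubectl rollout pause -f hack/testdata/recursive/deployment --recursive
  kubectl rollout resume -f hack/testdata/recursive/deployment --recursive
  kubectl rollout history -f hack/testdata/recursive/deployment --recursive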
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
deployment.apps "nginx1-deployment" force deleted
deployment.apps "nginx0-deployment" force deleted
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
E0114 16:01:28.509411   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:01:28.602164   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:01:28.712259   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:01:28.815911   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:426: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/busybox0 created
I0114 16:01:29.255201   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017680-27368", Name:"busybox0", UID:"d60e03d4-7f99-40ec-82c9-cd2b0f1253bb", APIVersion:"v1", ResourceVersion:"1144", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-4lnpp
replicationcontroller/busybox1 created
error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0114 16:01:29.260053   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017680-27368", Name:"busybox1", UID:"40e46ce5-3bdc-4879-bef0-e5081cc07278", APIVersion:"v1", ResourceVersion:"1146", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-br42h
generic-resources.sh:430: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:no rollbacker has been implemented for "ReplicationController"
Successful
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
E0114 16:01:29.510593   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox0" pausing is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox1" pausing is not supported
E0114 16:01:29.603284   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:replicationcontrollers "busybox0" resuming is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:replicationcontrollers "busybox1" resuming is not supported
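A note on the pause/resume assertions above: kubectl rollout pause and resume are implemented per kind, and ReplicationController has no pauser/resumer (or rollbacker), so each valid object yields a "not supported" error while the broken file still fails to decode. The invocations are plausibly of this shape (flags assumed from the recursive-file pattern used throughout this suite):

  kubectl rollout pause -f hack/testdata/recursive/rc --recursive
  kubectl rollout resume -f hack/testdata/recursive/rc --recursive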
E0114 16:01:29.713574   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
replicationcontroller "busybox0" force deleted
replicationcontroller "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
E0114 16:01:29.817077   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:01:30.511808   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:01:30.604584   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:01:30.714772   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_namespace_tests
Running command: run_namespace_tests

+++ Running case: test-cmd.run_namespace_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_namespace_tests
+++ [0114 16:01:30] Testing kubectl(v1:namespaces)
E0114 16:01:30.818304   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/my-namespace created
core.sh:1314: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
namespace "my-namespace" deleted
E0114 16:01:31.513131   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 19 lines ...
namespace/my-namespace condition met
Successful
message:Error from server (NotFound): namespaces "my-namespace" not found
has: not found
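The "condition met" line followed by the NotFound probe is the signature of waiting for a namespace to finish terminating; a minimal equivalent of what the test does (timeout value assumed):

  kubectl wait --for=delete namespace/my-namespace --timeout=60s
  kubectl get namespaces/my-namespace
  # Error from server (NotFound): namespaces "my-namespace" not found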
namespace/my-namespace created
core.sh:1323: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
E0114 16:01:36.518985   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
namespace "kube-node-lease" deleted
namespace "my-namespace" deleted
namespace "namespace-1579017545-30713" deleted
namespace "namespace-1579017548-17342" deleted
... skipping 26 lines ...
namespace "namespace-1579017639-28337" deleted
namespace "namespace-1579017640-16188" deleted
namespace "namespace-1579017642-31920" deleted
namespace "namespace-1579017643-14320" deleted
namespace "namespace-1579017679-4671" deleted
namespace "namespace-1579017680-27368" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:warning: deleting cluster-scoped resources
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
namespace "kube-node-lease" deleted
namespace "my-namespace" deleted
namespace "namespace-1579017545-30713" deleted
... skipping 27 lines ...
namespace "namespace-1579017639-28337" deleted
namespace "namespace-1579017640-16188" deleted
namespace "namespace-1579017642-31920" deleted
namespace "namespace-1579017643-14320" deleted
namespace "namespace-1579017679-4671" deleted
namespace "namespace-1579017680-27368" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:namespace "my-namespace" deleted
E0114 16:01:36.612040   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1335: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"other\" }}found{{end}}{{end}}:: :
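The escaped braces in assertions like core.sh:1335 are go-template queries; run by hand, the same check looks like (invocation shape assumed):

  kubectl get namespaces -o go-template='{{range.items}}{{if eq .metadata.name "other"}}found{{end}}{{end}}'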
E0114 16:01:36.722208   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/other created
E0114 16:01:36.825828   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1339: Successful get namespaces/other {{.metadata.name}}: other
core.sh:1343: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
pod/valid-pod created
core.sh:1347: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:1349: Successful get pods -n other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
Successful
message:error: a resource cannot be retrieved by name across all namespaces
has:a resource cannot be retrieved by name across all namespaces
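That error is kubectl's guard against combining a resource name with --all-namespaces; for example (assumed invocation):

  kubectl get pods valid-pod --all-namespaces
  # error: a resource cannot be retrieved by name across all namespaces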
core.sh:1356: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
E0114 16:01:37.520330   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
E0114 16:01:37.613242   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 16:01:37.623203   54489 shared_informer.go:206] Waiting for caches to sync for resource quota
I0114 16:01:37.623244   54489 shared_informer.go:213] Caches are synced for resource quota 
core.sh:1360: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:01:37.723959   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace "other" deleted
E0114 16:01:37.827020   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 16:01:38.137715   54489 shared_informer.go:206] Waiting for caches to sync for garbage collector
I0114 16:01:38.137779   54489 shared_informer.go:213] Caches are synced for garbage collector 
E0114 16:01:38.521608   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 7 lines ...
I0114 16:01:39.933107   54489 horizontal.go:353] Horizontal Pod Autoscaler busybox0 has been deleted in namespace-1579017680-27368
I0114 16:01:39.936698   54489 horizontal.go:353] Horizontal Pod Autoscaler busybox1 has been deleted in namespace-1579017680-27368
E0114 16:01:40.524182   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 11 lines ...
+++ exit code: 0
Recording: run_secrets_test
Running command: run_secrets_test

+++ Running case: test-cmd.run_secrets_test 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
... skipping 40 lines ...
  name: test
has not:example.com
core.sh:725: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-secrets\" }}found{{end}}{{end}}:: :
namespace/test-secrets created
core.sh:729: Successful get namespaces/test-secrets {{.metadata.name}}: test-secrets
core.sh:733: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:01:43.528203   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret/test-secret created
E0114 16:01:43.621153   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:737: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
E0114 16:01:43.731065   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:738: Successful get secret/test-secret --namespace=test-secrets {{.type}}: test-type
E0114 16:01:43.835244   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret "test-secret" deleted
core.sh:748: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
secret/test-secret created
core.sh:752: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
core.sh:753: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/dockerconfigjson
secret "test-secret" deleted
core.sh:763: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:01:44.529343   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret/test-secret created
E0114 16:01:44.622402   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:766: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
E0114 16:01:44.732160   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:767: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/tls
secret "test-secret" deleted
E0114 16:01:44.836321   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret/test-secret created
core.sh:773: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
core.sh:774: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/tls
secret "test-secret" deleted
secret/secret-string-data created
core.sh:796: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
core.sh:797: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
E0114 16:01:45.530659   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:798: Successful get secret/secret-string-data --namespace=test-secrets  {{.stringData}}: <no value>
E0114 16:01:45.623622   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret "secret-string-data" deleted
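The secret-string-data assertions show that stringData is write-only: the API server base64-encodes it into .data (djE= and djI= are base64 for v1 and v2), and reads of .stringData return <no value>. A minimal manifest that produces exactly this state (key layout assumed):

  apiVersion: v1
  kind: Secret
  metadata:
    name: secret-string-data
  data:
    k1: djE=            # base64("v1")
    k2: djI=            # base64("v2")
  stringData:
    k1: v1              # write-only; merged into .data on the server
    k2: v2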
core.sh:807: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:01:45.733296   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:01:45.837839   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret "test-secret" deleted
namespace "test-secrets" deleted
I0114 16:01:46.186708   54489 namespace_controller.go:185] Namespace has been deleted my-namespace
E0114 16:01:46.532040   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 16:01:46.622352   54489 namespace_controller.go:185] Namespace has been deleted kube-node-lease
E0114 16:01:46.624943   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 16:01:46.648662   54489 namespace_controller.go:185] Namespace has been deleted namespace-1579017548-17342
I0114 16:01:46.654996   54489 namespace_controller.go:185] Namespace has been deleted namespace-1579017562-115
I0114 16:01:46.657204   54489 namespace_controller.go:185] Namespace has been deleted namespace-1579017562-19965
I0114 16:01:46.661061   54489 namespace_controller.go:185] Namespace has been deleted namespace-1579017545-30713
I0114 16:01:46.680857   54489 namespace_controller.go:185] Namespace has been deleted namespace-1579017552-9291
I0114 16:01:46.688746   54489 namespace_controller.go:185] Namespace has been deleted namespace-1579017567-24379
I0114 16:01:46.688748   54489 namespace_controller.go:185] Namespace has been deleted namespace-1579017566-18866
I0114 16:01:46.692043   54489 namespace_controller.go:185] Namespace has been deleted namespace-1579017565-9784
E0114 16:01:46.734678   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 16:01:46.738138   54489 namespace_controller.go:185] Namespace has been deleted namespace-1579017558-31598
I0114 16:01:46.816091   54489 namespace_controller.go:185] Namespace has been deleted namespace-1579017576-9239
E0114 16:01:46.838996   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 16:01:46.855572   54489 namespace_controller.go:185] Namespace has been deleted namespace-1579017588-25094
I0114 16:01:46.861744   54489 namespace_controller.go:185] Namespace has been deleted namespace-1579017589-9223
I0114 16:01:46.867868   54489 namespace_controller.go:185] Namespace has been deleted namespace-1579017577-30246
I0114 16:01:46.871156   54489 namespace_controller.go:185] Namespace has been deleted namespace-1579017595-23687
I0114 16:01:46.871329   54489 namespace_controller.go:185] Namespace has been deleted namespace-1579017591-6282
I0114 16:01:46.871424   54489 namespace_controller.go:185] Namespace has been deleted namespace-1579017592-4763
... skipping 12 lines ...
I0114 16:01:47.174066   54489 namespace_controller.go:185] Namespace has been deleted namespace-1579017639-28337
I0114 16:01:47.202673   54489 namespace_controller.go:185] Namespace has been deleted namespace-1579017640-16188
I0114 16:01:47.232623   54489 namespace_controller.go:185] Namespace has been deleted namespace-1579017642-31920
I0114 16:01:47.243844   54489 namespace_controller.go:185] Namespace has been deleted namespace-1579017679-4671
I0114 16:01:47.245358   54489 namespace_controller.go:185] Namespace has been deleted namespace-1579017643-14320
I0114 16:01:47.295489   54489 namespace_controller.go:185] Namespace has been deleted namespace-1579017680-27368
E0114 16:01:47.533199   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:01:47.626140   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:01:47.735840   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:01:47.840309   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 16:01:47.843099   54489 namespace_controller.go:185] Namespace has been deleted other
E0114 16:01:48.535219   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 11 lines ...
+++ exit code: 0
Recording: run_configmap_tests
Running command: run_configmap_tests

+++ Running case: test-cmd.run_configmap_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_configmap_tests
+++ [0114 16:01:51] Creating namespace namespace-1579017711-5768
namespace/namespace-1579017711-5768 created
Context "test" modified.
+++ [0114 16:01:51] Testing configmaps
configmap/test-configmap created
E0114 16:01:51.538824   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:28: Successful get configmap/test-configmap {{.metadata.name}}: test-configmap
E0114 16:01:51.631714   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap "test-configmap" deleted
E0114 16:01:51.741079   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:33: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-configmaps\" }}found{{end}}{{end}}:: :
namespace/test-configmaps created
E0114 16:01:51.846145   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:37: Successful get namespaces/test-configmaps {{.metadata.name}}: test-configmaps
core.sh:41: Successful get configmaps {{range.items}}{{ if eq .metadata.name \"test-configmap\" }}found{{end}}{{end}}:: :
core.sh:42: Successful get configmaps {{range.items}}{{ if eq .metadata.name \"test-binary-configmap\" }}found{{end}}{{end}}:: :
configmap/test-configmap created
configmap/test-binary-configmap created
core.sh:48: Successful get configmap/test-configmap --namespace=test-configmaps {{.metadata.name}}: test-configmap
core.sh:49: Successful get configmap/test-binary-configmap --namespace=test-configmaps {{.metadata.name}}: test-binary-configmap
E0114 16:01:52.540556   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap "test-configmap" deleted
E0114 16:01:52.632862   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap "test-binary-configmap" deleted
E0114 16:01:52.742513   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace "test-configmaps" deleted
E0114 16:01:52.847351   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 12 lines ...
I0114 16:01:56.038326   54489 namespace_controller.go:185] Namespace has been deleted test-secrets
E0114 16:01:56.545458   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 7 lines ...
+++ exit code: 0
Recording: run_client_config_tests
Running command: run_client_config_tests

+++ Running case: test-cmd.run_client_config_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_client_config_tests
+++ [0114 16:01:57] Creating namespace namespace-1579017717-22649
namespace/namespace-1579017717-22649 created
Context "test" modified.
+++ [0114 16:01:58] Testing client config
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:Error in configuration: context was not found for specified context: missing-context
has:context was not found for specified context: missing-context
Successful
message:error: no server found for cluster "missing-cluster"
has:no server found for cluster "missing-cluster"
Successful
message:error: auth info "missing-user" does not exist
has:auth info "missing-user" does not exist
E0114 16:01:58.547891   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:01:58.640588   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: error loading config file "/tmp/newconfig.yaml": no kind "Config" is registered for version "v-1" in scheme "k8s.io/client-go/tools/clientcmd/api/latest/latest.go:50"
has:error loading config file
Successful
message:error: stat missing-config: no such file or directory
has:no such file or directory
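Each failure above maps to one client-config flag; the checks are plausibly driven by invocations like (command shapes assumed):

  kubectl get pods --kubeconfig=missing          # error: stat missing: no such file or directory
  kubectl get pods --context=missing-context     # context was not found for specified context
  kubectl get pods --cluster=missing-cluster     # no server found for cluster "missing-cluster"
  kubectl get pods --user=missing-user           # auth info "missing-user" does not exist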
E0114 16:01:58.749644   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_service_accounts_tests
Running command: run_service_accounts_tests

+++ Running case: test-cmd.run_service_accounts_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_service_accounts_tests
+++ [0114 16:01:58] Creating namespace namespace-1579017718-20296
E0114 16:01:58.854603   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1579017718-20296 created
Context "test" modified.
+++ [0114 16:01:58] Testing service accounts
core.sh:828: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-service-accounts\" }}found{{end}}{{end}}:: :
namespace/test-service-accounts created
core.sh:832: Successful get namespaces/test-service-accounts {{.metadata.name}}: test-service-accounts
serviceaccount/test-service-account created
core.sh:838: Successful get serviceaccount/test-service-account --namespace=test-service-accounts {{.metadata.name}}: test-service-account
serviceaccount "test-service-account" deleted
E0114 16:01:59.549132   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace "test-service-accounts" deleted
E0114 16:01:59.641804   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 14 lines ...
I0114 16:02:02.860223   54489 namespace_controller.go:185] Namespace has been deleted test-configmaps
E0114 16:02:03.554169   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 5 lines ...
+++ exit code: 0
E0114 16:02:04.756937   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_job_tests
Running command: run_job_tests

+++ Running case: test-cmd.run_job_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_job_tests
+++ [0114 16:02:04] Creating namespace namespace-1579017724-32518
E0114 16:02:04.862249   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1579017724-32518 created
Context "test" modified.
+++ [0114 16:02:04] Testing job
batch.sh:30: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-jobs\" }}found{{end}}{{end}}:: :
namespace/test-jobs created
batch.sh:34: Successful get namespaces/test-jobs {{.metadata.name}}: test-jobs
... skipping 7 lines ...
Labels:                        run=pi
Annotations:                   <none>
Schedule:                      59 23 31 2 *
Concurrency Policy:            Allow
Suspend:                       False
Successful Job History Limit:  3
Failed Job History Limit:      1
Starting Deadline Seconds:     <unset>
Selector:                      <unset>
Parallelism:                   <unset>
Completions:                   <unset>
Pod Template:
  Labels:  run=pi
... skipping 13 lines ...
    Environment:     <none>
    Mounts:          <none>
  Volumes:           <none>
Last Schedule Time:  <unset>
Active Jobs:         <none>
Events:              <none>
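The describe output above belongs to a CronJob named pi scheduled at 59 23 31 2 *; given the generator deprecation notice later in this log, it was plausibly created along these lines (image and command assumed):

  kubectl run pi --generator=cronjob/v1beta1 --schedule="59 23 31 2 *" \
    --image=k8s.gcr.io/perl --restart=OnFailure -- perl -Mbignum=bpi -wle 'print bpi(20)'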
E0114 16:02:05.556558   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:job.batch/test-job
has:job.batch/test-job
E0114 16:02:05.649055   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
batch.sh:48: Successful get jobs {{range.items}}{{.metadata.name}}{{end}}: 
E0114 16:02:05.758109   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 16:02:05.777744   54489 event.go:278] Event(v1.ObjectReference{Kind:"Job", Namespace:"test-jobs", Name:"test-job", UID:"d4e8b47b-a3f5-46d9-a804-4ccb73ffd2c6", APIVersion:"batch/v1", ResourceVersion:"1484", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-725pd
job.batch/test-job created
E0114 16:02:05.863222   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
batch.sh:53: Successful get job/test-job --namespace=test-jobs {{.metadata.name}}: test-job
NAME       COMPLETIONS   DURATION   AGE
test-job   0/1           0s         0s
Name:           test-job
Namespace:      test-jobs
Selector:       controller-uid=d4e8b47b-a3f5-46d9-a804-4ccb73ffd2c6
... skipping 2 lines ...
                run=pi
Annotations:    cronjob.kubernetes.io/instantiate: manual
Controlled By:  CronJob/pi
Parallelism:    1
Completions:    1
Start Time:     Tue, 14 Jan 2020 16:02:05 +0000
Pods Statuses:  1 Running / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  controller-uid=d4e8b47b-a3f5-46d9-a804-4ccb73ffd2c6
           job-name=test-job
           run=pi
  Containers:
   pi:
... skipping 15 lines ...
  Type    Reason            Age   From            Message
  ----    ------            ----  ----            -------
  Normal  SuccessfulCreate  1s    job-controller  Created pod: test-job-725pd
job.batch "test-job" deleted
cronjob.batch "pi" deleted
namespace "test-jobs" deleted
E0114 16:02:06.557890   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 13 lines ...
I0114 16:02:09.699067   54489 namespace_controller.go:185] Namespace has been deleted test-service-accounts
E0114 16:02:09.763300   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:09.868373   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:10.563068   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:10.655368   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:10.764593   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:10.869698   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_create_job_tests
Running command: run_create_job_tests

+++ Running case: test-cmd.run_create_job_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_create_job_tests
+++ [0114 16:02:11] Creating namespace namespace-1579017731-29167
namespace/namespace-1579017731-29167 created
E0114 16:02:11.564290   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
E0114 16:02:11.656502   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 16:02:11.681119   54489 event.go:278] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1579017731-29167", Name:"test-job", UID:"d03cce81-8047-4898-bfb2-3e087f8f7f67", APIVersion:"batch/v1", ResourceVersion:"1506", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-vqtgq
job.batch/test-job created
E0114 16:02:11.766004   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
create.sh:86: Successful get job test-job {{(index .spec.template.spec.containers 0).image}}: k8s.gcr.io/nginx:test-cmd
job.batch "test-job" deleted
E0114 16:02:11.870836   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 16:02:11.934204   54489 event.go:278] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1579017731-29167", Name:"test-job-pi", UID:"cfe5377c-1817-4c4f-9875-604c5173e995", APIVersion:"batch/v1", ResourceVersion:"1513", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-pi-hcr78
job.batch/test-job-pi created
create.sh:92: Successful get job test-job-pi {{(index .spec.template.spec.containers 0).image}}: k8s.gcr.io/perl
job.batch "test-job-pi" deleted
kubectl run --generator=cronjob/v1beta1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
cronjob.batch/test-pi created
... skipping 4 lines ...
has:perl -Mbignum=bpi -wle print bpi(10)
job.batch "my-pi" deleted
cronjob.batch "test-pi" deleted
+++ exit code: 0
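For reference, the job assertions above can be reproduced by hand against the same test server; a minimal sketch (commands reconstructed from the log, the exact invocations live in the create.sh suite referenced above and are not shown here):

  kubectl create job test-job --image=k8s.gcr.io/nginx:test-cmd
  kubectl get job test-job -o go-template='{{(index .spec.template.spec.containers 0).image}}'
  kubectl create job my-pi --from=cronjob/test-pi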
Recording: run_pod_templates_tests
Running command: run_pod_templates_tests
E0114 16:02:12.565604   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_pod_templates_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_pod_templates_tests
+++ [0114 16:02:12] Creating namespace namespace-1579017732-23306
E0114 16:02:12.657805   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1579017732-23306 created
Context "test" modified.
+++ [0114 16:02:12] Testing pod templates
E0114 16:02:12.767249   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1421: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:02:12.872087   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 16:02:12.989803   51025 controller.go:606] quota admission added evaluator for: podtemplates
podtemplate/nginx created
core.sh:1425: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:
NAME    CONTAINERS   IMAGES   POD LABELS
nginx   nginx        nginx    name=nginx
core.sh:1433: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:
podtemplate "nginx" deleted
core.sh:1437: Successful get podtemplate {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
Recording: run_service_tests
Running command: run_service_tests
E0114 16:02:13.566777   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_service_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_service_tests
Context "test" modified.
E0114 16:02:13.658935   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [0114 16:02:13] Testing kubectl(v1:services)
core.sh:858: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E0114 16:02:13.768522   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:13.873238   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master created
core.sh:862: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
matched Name:
matched Labels:
matched Selector:
matched IP:
... skipping 96 lines ...
IP:                10.0.0.81
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
E0114 16:02:14.568072   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:              kubernetes
Namespace:         default
Labels:            component=apiserver
                   provider=kubernetes
Annotations:       <none>
... skipping 18 lines ...
IP:                10.0.0.81
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
E0114 16:02:14.660182   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:              kubernetes
Namespace:         default
Labels:            component=apiserver
                   provider=kubernetes
Annotations:       <none>
... skipping 16 lines ...
Type:              ClusterIP
IP:                10.0.0.81
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
E0114 16:02:14.769648   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:              kubernetes
Namespace:         default
Labels:            component=apiserver
                   provider=kubernetes
Annotations:       <none>
... skipping 18 lines ...
IP:                10.0.0.81
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
E0114 16:02:14.874517   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:882: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
apiVersion: v1
kind: Service
metadata:
  creationTimestamp: null
  labels:
... skipping 116 lines ...
  selector:
    role: padawan
  sessionAffinity: None
  type: ClusterIP
status:
  loadBalancer: {}
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
E0114 16:02:15.569202   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:898: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
E0114 16:02:15.661337   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master selector updated
E0114 16:02:15.771009   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Error from server (Conflict): Operation cannot be fulfilled on services "redis-master": the object has been modified; please apply your changes to the latest version and try again
has:Conflict
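The Conflict above is the apiserver's optimistic-concurrency check: any write carrying a stale metadata.resourceVersion is rejected. A minimal sketch of the same failure, assuming the redis-master service from this test:

  kubectl get service redis-master -o yaml > /tmp/svc.yaml   # snapshot includes resourceVersion
  kubectl annotate service redis-master touched=yes          # any write bumps resourceVersion
  kubectl replace -f /tmp/svc.yaml                           # 409: the object has been modified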
E0114 16:02:15.875616   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:911: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
(Bservice "redis-master" deleted
core.sh:918: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
(Bcore.sh:922: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
(BI0114 16:02:16.367291   54489 namespace_controller.go:185] Namespace has been deleted test-jobs
service/redis-master created
E0114 16:02:16.570277   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:926: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
E0114 16:02:16.662631   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:930: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
E0114 16:02:16.772319   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/service-v1-test created
E0114 16:02:16.876777   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:951: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:service-v1-test:
service/service-v1-test replaced
core.sh:958: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:service-v1-test:
(Bservice "redis-master" deleted
service "service-v1-test" deleted
core.sh:966: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E0114 16:02:17.571611   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:970: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E0114 16:02:17.663865   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:17.773694   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master created
E0114 16:02:17.878003   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-slave created
core.sh:975: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:redis-slave:
Successful
message:NAME           RSRC
kubernetes     144
redis-master   1559
redis-slave    1562
has:redis-master
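The NAME/RSRC table above has the shape of a custom-columns query; a sketch of an equivalent command (the exact flag used by core.sh is an assumption):

  kubectl get services -o custom-columns=NAME:.metadata.name,RSRC:.metadata.resourceVersion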
core.sh:985: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:redis-slave:
(Bservice "redis-master" deleted
service "redis-slave" deleted
core.sh:992: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
core.sh:996: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E0114 16:02:18.572843   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/beep-boop created
E0114 16:02:18.664976   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1000: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: beep-boop:kubernetes:
E0114 16:02:18.774877   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1004: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: beep-boop:kubernetes:
E0114 16:02:18.879143   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "beep-boop" deleted
core.sh:1011: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
core.sh:1015: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
kubectl run --generator=deployment/apps.v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
I0114 16:02:19.194956   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"default", Name:"testmetadata", UID:"b867026f-4fa3-4b92-addb-caad872afeb8", APIVersion:"apps/v1", ResourceVersion:"1576", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set testmetadata-bd968f46 to 2
I0114 16:02:19.203178   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"default", Name:"testmetadata-bd968f46", UID:"7853c88e-1847-43fd-b1f7-1e75d2a2ffed", APIVersion:"apps/v1", ResourceVersion:"1577", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: testmetadata-bd968f46-tj5wh
I0114 16:02:19.206591   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"default", Name:"testmetadata-bd968f46", UID:"7853c88e-1847-43fd-b1f7-1e75d2a2ffed", APIVersion:"apps/v1", ResourceVersion:"1577", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: testmetadata-bd968f46-htjb6
service/testmetadata created
deployment.apps/testmetadata created
core.sh:1019: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: testmetadata:
core.sh:1020: Successful get service testmetadata {{.metadata.annotations}}: map[zone-context:home]
service/exposemetadata exposed
E0114 16:02:19.574073   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1026: Successful get service exposemetadata {{.metadata.annotations}}: map[zone-context:work]
E0114 16:02:19.666239   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "exposemetadata" deleted
service "testmetadata" deleted
E0114 16:02:19.775984   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "testmetadata" deleted
+++ exit code: 0
Recording: run_daemonset_tests
Running command: run_daemonset_tests
E0114 16:02:19.880312   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_daemonset_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_daemonset_tests
+++ [0114 16:02:19] Creating namespace namespace-1579017739-29118
namespace/namespace-1579017739-29118 created
... skipping 2 lines ...
apps.sh:30: Successful get daemonsets {{range.items}}{{.metadata.name}}:{{end}}: 
I0114 16:02:20.300646   51025 controller.go:606] quota admission added evaluator for: daemonsets.apps
daemonset.apps/bind created
I0114 16:02:20.311574   51025 controller.go:606] quota admission added evaluator for: controllerrevisions.apps
apps.sh:34: Successful get daemonsets bind {{.metadata.generation}}: 1
daemonset.apps/bind configured
E0114 16:02:20.575337   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:37: Successful get daemonsets bind {{.metadata.generation}}: 1
E0114 16:02:20.667486   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind image updated
E0114 16:02:20.777143   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:40: Successful get daemonsets bind {{.metadata.generation}}: 2
E0114 16:02:20.881545   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind env updated
apps.sh:42: Successful get daemonsets bind {{.metadata.generation}}: 3
daemonset.apps/bind resource requirements updated
apps.sh:44: Successful get daemonsets bind {{.metadata.generation}}: 4
daemonset.apps/bind restarted
apps.sh:48: Successful get daemonsets bind {{.metadata.generation}}: 5
... skipping 3 lines ...
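Each "updated"/"restarted" line above bumps the daemonset's .metadata.generation because it mutates the pod template. Sketches of commands that produce those transitions (container name and values are illustrative, not taken from the log):

  kubectl set image daemonset/bind kubernetes-pause=k8s.gcr.io/pause:latest   # generation 1 -> 2
  kubectl set env daemonset/bind FOO=bar                                      # 2 -> 3
  kubectl set resources daemonset/bind --limits=cpu=200m,memory=512Mi         # 3 -> 4
  kubectl rollout restart daemonset/bind                                      # 4 -> 5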
Running command: run_daemonset_history_tests

+++ Running case: test-cmd.run_daemonset_history_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_daemonset_history_tests
+++ [0114 16:02:21] Creating namespace namespace-1579017741-27424
E0114 16:02:21.576565   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1579017741-27424 created
E0114 16:02:21.668662   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0114 16:02:21] Testing kubectl(v1:daemonsets, v1:controllerrevisions)
E0114 16:02:21.778287   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:66: Successful get daemonsets {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:02:21.882726   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind created
apps.sh:70: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[deprecated.daemonset.template.generation:1 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1579017741-27424"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:2.0","name":"kubernetes-pause"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
daemonset.apps/bind skipped rollback (current template already matches revision 1)
apps.sh:73: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:74: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
daemonset.apps/bind configured
E0114 16:02:22.577601   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:77: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
apps.sh:78: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
(BE0114 16:02:22.669964   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:79: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
(BE0114 16:02:22.779530   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:80: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[deprecated.daemonset.template.generation:1 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1579017741-27424"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:2.0","name":"kubernetes-pause"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:map[deprecated.daemonset.template.generation:2 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1579017741-27424"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:latest","name":"kubernetes-pause"},{"image":"k8s.gcr.io/nginx:test-cmd","name":"app"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
E0114 16:02:22.883966   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind will roll back to Pod Template:
  Labels:	service=bind
  Containers:
   kubernetes-pause:
    Image:	k8s.gcr.io/pause:2.0
    Port:	<none>
... skipping 3 lines ...
  Volumes:	<none>
 (dry run)
apps.sh:83: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
apps.sh:84: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:85: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
daemonset.apps/bind rolled back
E0114 16:02:23.334590   54489 daemon_controller.go:291] namespace-1579017741-27424/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1579017741-27424", SelfLink:"/apis/apps/v1/namespaces/namespace-1579017741-27424/daemonsets/bind", UID:"1de030e8-7018-40e5-a730-a288f23478d3", ResourceVersion:"1645", Generation:3, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714614541, loc:(*time.Location)(0x6b23a80)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"deprecated.daemonset.template.generation":"3", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true\"},\"labels\":{\"service\":\"bind\"},\"name\":\"bind\",\"namespace\":\"namespace-1579017741-27424\"},\"spec\":{\"selector\":{\"matchLabels\":{\"service\":\"bind\"}},\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n", "kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"kube-controller-manager", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001cc99a0), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc001cc99e0)}, v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001cc9a20), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc001cc9a40)}}}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc001cc9a60), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:2.0", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), 
VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc002f2ac98), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc0022c4600), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc001cc9a80), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc000f38fb0)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc002f2acec)}, Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:2, UpdatedNumberScheduled:0, NumberAvailable:0, NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
apps.sh:88: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:89: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
E0114 16:02:23.578614   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
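Rollbacks resolve against the controllerrevisions listed earlier, and an out-of-range revision fails exactly as shown. A sketch, assuming the bind daemonset from this test:

  kubectl rollout history daemonset/bind                      # lists revisions 1, 2, ...
  kubectl rollout undo daemonset/bind --to-revision=1         # restores that pod template
  kubectl rollout undo daemonset/bind --to-revision=1000000   # error: unable to find specified revision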
E0114 16:02:23.671125   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:93: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:94: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
E0114 16:02:23.780734   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind rolled back
E0114 16:02:23.885115   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:23.889171   54489 daemon_controller.go:291] namespace-1579017741-27424/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1579017741-27424", SelfLink:"/apis/apps/v1/namespaces/namespace-1579017741-27424/daemonsets/bind", UID:"1de030e8-7018-40e5-a730-a288f23478d3", ResourceVersion:"1648", Generation:4, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714614541, loc:(*time.Location)(0x6b23a80)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"deprecated.daemonset.template.generation":"4", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true\"},\"labels\":{\"service\":\"bind\"},\"name\":\"bind\",\"namespace\":\"namespace-1579017741-27424\"},\"spec\":{\"selector\":{\"matchLabels\":{\"service\":\"bind\"}},\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n", "kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"kube-controller-manager", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001e4cfe0), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc001e4d020)}, v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001e4d060), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc001e4d080)}}}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc001e4d0a0), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:latest", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), 
VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}, v1.Container{Name:"app", Image:"k8s.gcr.io/nginx:test-cmd", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc002f631a8), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc002bb1980), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc001e4d0c0), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc00040d360)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc002f631fc)}, Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:3, UpdatedNumberScheduled:0, NumberAvailable:0, NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
apps.sh:97: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
apps.sh:98: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:99: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
daemonset.apps "bind" deleted
+++ exit code: 0
Recording: run_rc_tests
... skipping 4 lines ...
+++ command: run_rc_tests
+++ [0114 16:02:24] Creating namespace namespace-1579017744-24008
namespace/namespace-1579017744-24008 created
Context "test" modified.
+++ [0114 16:02:24] Testing kubectl(v1:replicationcontrollers)
core.sh:1052: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:02:24.593842   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:24.672259   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend created
I0114 16:02:24.740298   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"frontend", UID:"ed482382-d51d-4653-aaec-5633c44b1794", APIVersion:"v1", ResourceVersion:"1656", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-8mdmp
I0114 16:02:24.743684   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"frontend", UID:"ed482382-d51d-4653-aaec-5633c44b1794", APIVersion:"v1", ResourceVersion:"1656", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-5drb7
I0114 16:02:24.745596   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"frontend", UID:"ed482382-d51d-4653-aaec-5633c44b1794", APIVersion:"v1", ResourceVersion:"1656", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-r2mh7
E0114 16:02:24.782057   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller "frontend" deleted
E0114 16:02:24.886311   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1057: Successful get pods -l "name=frontend" {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:1061: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/frontend created
I0114 16:02:25.165108   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"frontend", UID:"aab69caa-1752-4e4c-ae91-64be69fb3ba1", APIVersion:"v1", ResourceVersion:"1674", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-nv2z2
I0114 16:02:25.168875   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"frontend", UID:"aab69caa-1752-4e4c-ae91-64be69fb3ba1", APIVersion:"v1", ResourceVersion:"1674", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-97dqs
I0114 16:02:25.170532   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"frontend", UID:"aab69caa-1752-4e4c-ae91-64be69fb3ba1", APIVersion:"v1", ResourceVersion:"1674", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-ngmcs
... skipping 11 lines ...
Namespace:    namespace-1579017744-24008
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
Namespace:    namespace-1579017744-24008
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
Namespace:    namespace-1579017744-24008
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 4 lines ...
      memory:  100Mi
    Environment:
      GET_HOSTS_FROM:  dns
    Mounts:            <none>
  Volumes:             <none>
E0114 16:02:25.595006   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:25.673224   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1073: Successful describe
Name:         frontend
Namespace:    namespace-1579017744-24008
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 12 lines ...
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-nv2z2
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-97dqs
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-ngmcs
matched Name:
matched Name:
E0114 16:02:25.783151   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Pod Template:
matched Labels:
matched Selector:
matched Replicas:
matched Pods Status:
matched Volumes:
... skipping 3 lines ...
Namespace:    namespace-1579017744-24008
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-nv2z2
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-97dqs
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-ngmcs
E0114 16:02:25.887390   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1579017744-24008
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
Namespace:    namespace-1579017744-24008
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 11 lines ...
Namespace:    namespace-1579017744-24008
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 15 lines ...
core.sh:1085: Successful get rc frontend {{.spec.replicas}}: 3
replicationcontroller/frontend scaled
E0114 16:02:26.308843   54489 replica_set.go:199] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1579017744-24008 /api/v1/namespaces/namespace-1579017744-24008/replicationcontrollers/frontend aab69caa-1752-4e4c-ae91-64be69fb3ba1 1683 2 2020-01-14 16:02:25 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  [{kubectl Update v1 2020-01-14 16:02:25 +0000 UTC FieldsV1 FieldsV1{Raw:*[... decimal byte dump of managed-fields JSON elided ...],}} {kube-controller-manager Update v1 2020-01-14 16:02:25 +0000 UTC FieldsV1 &FieldsV1{Raw:*[... decimal byte dump elided ...],}}]},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v4 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc002b72f08 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] []}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:1,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
I0114 16:02:26.314292   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"frontend", UID:"aab69caa-1752-4e4c-ae91-64be69fb3ba1", APIVersion:"v1", ResourceVersion:"1683", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-nv2z2
core.sh:1089: Successful get rc frontend {{.spec.replicas}}: 2
core.sh:1093: Successful get rc frontend {{.spec.replicas}}: 2
E0114 16:02:26.596036   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: Expected replicas to be 3, was 2
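That error is kubectl scale's precondition check: when --current-replicas is given, the scale is applied only if the live count matches. A sketch against the frontend controller from this test (replica counts assumed):

  kubectl scale rc frontend --current-replicas=3 --replicas=1   # rejected: live count is 2, not 3
  kubectl scale rc frontend --current-replicas=2 --replicas=3   # applied: precondition matches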
E0114 16:02:26.674417   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1097: Successful get rc frontend {{.spec.replicas}}: 2
core.sh:1101: Successful get rc frontend {{.spec.replicas}}: 2
E0114 16:02:26.784307   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend scaled
I0114 16:02:26.875583   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"frontend", UID:"aab69caa-1752-4e4c-ae91-64be69fb3ba1", APIVersion:"v1", ResourceVersion:"1689", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-g4prj
E0114 16:02:26.888506   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1105: Successful get rc frontend {{.spec.replicas}}: 3
core.sh:1109: Successful get rc frontend {{.spec.replicas}}: 3
E0114 16:02:27.151532   54489 replica_set.go:199] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1579017744-24008 /api/v1/namespaces/namespace-1579017744-24008/replicationcontrollers/frontend aab69caa-1752-4e4c-ae91-64be69fb3ba1 1694 4 2020-01-14 16:02:25 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  [{kubectl Update v1 2020-01-14 16:02:25 +0000 UTC FieldsV1 FieldsV1{Raw:*[... decimal byte dump of managed-fields JSON elided ...],}} {kube-controller-manager Update v1 2020-01-14 16:02:26 +0000 UTC FieldsV1 &FieldsV1{Raw:*[... decimal byte dump elided ...],}}]},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v4 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc0028262f8 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] []}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:3,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
replicationcontroller/frontend scaled
I0114 16:02:27.157154   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"frontend", UID:"aab69caa-1752-4e4c-ae91-64be69fb3ba1", APIVersion:"v1", ResourceVersion:"1694", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-g4prj
core.sh:1113: Successful get rc frontend {{.spec.replicas}}: 2
replicationcontroller "frontend" deleted
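The frontend sequence above (SuccessfulDelete event, replica assertion at core.sh:1113, delete) is what a kubectl scale round-trip looks like from the controller side. A minimal sketch of the client side, assuming a reachable test cluster and the resource names from this log:

  # Scale the replication controller down to 2; kubectl prints "replicationcontroller/frontend scaled".
  kubectl scale rc frontend --replicas=2
  # The harness-style assertion reads the field back with a go-template.
  kubectl get rc frontend -o go-template='{{.spec.replicas}}'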
replicationcontroller/redis-master created
I0114 16:02:27.520024   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"redis-master", UID:"5817a1b7-0922-466a-b4a4-e0b3773e8c37", APIVersion:"v1", ResourceVersion:"1709", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-9vdsq
E0114 16:02:27.597322   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:27.675695   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/redis-slave created
I0114 16:02:27.704027   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"redis-slave", UID:"ee988022-0a30-44bc-bc6e-9caddfd2a1a9", APIVersion:"v1", ResourceVersion:"1714", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-pm7bg
I0114 16:02:27.708127   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"redis-slave", UID:"ee988022-0a30-44bc-bc6e-9caddfd2a1a9", APIVersion:"v1", ResourceVersion:"1714", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-q297f
E0114 16:02:27.785610   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/redis-master scaled
I0114 16:02:27.798869   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"redis-master", UID:"5817a1b7-0922-466a-b4a4-e0b3773e8c37", APIVersion:"v1", ResourceVersion:"1721", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-2g99k
replicationcontroller/redis-slave scaled
I0114 16:02:27.803027   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"redis-master", UID:"5817a1b7-0922-466a-b4a4-e0b3773e8c37", APIVersion:"v1", ResourceVersion:"1721", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-gp9j5
I0114 16:02:27.803238   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"redis-master", UID:"5817a1b7-0922-466a-b4a4-e0b3773e8c37", APIVersion:"v1", ResourceVersion:"1721", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-jmxhd
I0114 16:02:27.803560   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"redis-slave", UID:"ee988022-0a30-44bc-bc6e-9caddfd2a1a9", APIVersion:"v1", ResourceVersion:"1723", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-s2nzg
I0114 16:02:27.807224   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"redis-slave", UID:"ee988022-0a30-44bc-bc6e-9caddfd2a1a9", APIVersion:"v1", ResourceVersion:"1723", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-hfkbt
E0114 16:02:27.889604   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1123: Successful get rc redis-master {{.spec.replicas}}: 4
core.sh:1124: Successful get rc redis-slave {{.spec.replicas}}: 4
replicationcontroller "redis-master" deleted
replicationcontroller "redis-slave" deleted
deployment.apps/nginx-deployment created
I0114 16:02:28.278014   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017744-24008", Name:"nginx-deployment", UID:"206b741c-0831-48e8-93f5-707c60ff0be7", APIVersion:"apps/v1", ResourceVersion:"1755", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
... skipping 3 lines ...
deployment.apps/nginx-deployment scaled
I0114 16:02:28.380392   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017744-24008", Name:"nginx-deployment", UID:"206b741c-0831-48e8-93f5-707c60ff0be7", APIVersion:"apps/v1", ResourceVersion:"1769", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-6986c7bc94 to 1
I0114 16:02:28.391318   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017744-24008", Name:"nginx-deployment-6986c7bc94", UID:"99ce42a6-1e6b-4461-b003-fca6de1ff2dc", APIVersion:"apps/v1", ResourceVersion:"1770", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-6986c7bc94-rwwfj
I0114 16:02:28.393307   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017744-24008", Name:"nginx-deployment-6986c7bc94", UID:"99ce42a6-1e6b-4461-b003-fca6de1ff2dc", APIVersion:"apps/v1", ResourceVersion:"1770", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-6986c7bc94-p54bp
core.sh:1133: Successful get deployment nginx-deployment {{.spec.replicas}}: 1
deployment.apps "nginx-deployment" deleted
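Deployments scale the same way; the ScalingReplicaSet and SuccessfulDelete events above are the deployment controller reacting to the new replica count. A sketch, assuming the deployment from this log:

  # Scale the deployment down from 3 to 1; the controller then scales its ReplicaSet.
  kubectl scale deployment nginx-deployment --replicas=1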
E0114 16:02:28.598297   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:28.676952   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:service/expose-test-deployment exposed
has:service/expose-test-deployment exposed
E0114 16:02:28.786842   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "expose-test-deployment" deleted
E0114 16:02:28.890765   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
See 'kubectl expose -h' for help and examples
has:invalid deployment: no selectors
deployment.apps/nginx-deployment created
I0114 16:02:29.058473   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017744-24008", Name:"nginx-deployment", UID:"4f4fc020-0c84-4ea7-b14e-489e4941f0f6", APIVersion:"apps/v1", ResourceVersion:"1793", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I0114 16:02:29.063918   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017744-24008", Name:"nginx-deployment-6986c7bc94", UID:"26c99946-a2a2-44c4-83de-60428c6404fc", APIVersion:"apps/v1", ResourceVersion:"1794", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-rkktj
I0114 16:02:29.066990   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017744-24008", Name:"nginx-deployment-6986c7bc94", UID:"26c99946-a2a2-44c4-83de-60428c6404fc", APIVersion:"apps/v1", ResourceVersion:"1794", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-rn42g
I0114 16:02:29.068333   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017744-24008", Name:"nginx-deployment-6986c7bc94", UID:"26c99946-a2a2-44c4-83de-60428c6404fc", APIVersion:"apps/v1", ResourceVersion:"1794", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-xnbn9
core.sh:1152: Successful get deployment nginx-deployment {{.spec.replicas}}: 3
service/nginx-deployment exposed
core.sh:1156: Successful get service nginx-deployment {{(index .spec.ports 0).port}}: 80
deployment.apps "nginx-deployment" deleted
service "nginx-deployment" deleted
E0114 16:02:29.599367   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend created
I0114 16:02:29.637208   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"frontend", UID:"6a7e2909-bc0d-47c8-9714-b1319dfad902", APIVersion:"v1", ResourceVersion:"1824", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-rzk94
I0114 16:02:29.640815   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"frontend", UID:"6a7e2909-bc0d-47c8-9714-b1319dfad902", APIVersion:"v1", ResourceVersion:"1824", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-rmdmm
I0114 16:02:29.641357   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"frontend", UID:"6a7e2909-bc0d-47c8-9714-b1319dfad902", APIVersion:"v1", ResourceVersion:"1824", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-lwd4l
E0114 16:02:29.678138   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1163: Successful get rc frontend {{.spec.replicas}}: 3
E0114 16:02:29.787981   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/frontend exposed
E0114 16:02:29.891955   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1167: Successful get service frontend {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
service/frontend-2 exposed
core.sh:1171: Successful get service frontend-2 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 443
pod/valid-pod created
service/frontend-3 exposed
core.sh:1176: Successful get service frontend-3 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 444
service/frontend-4 exposed
E0114 16:02:30.600542   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1180: Successful get service frontend-4 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: default 80
E0114 16:02:30.679345   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/frontend-5 exposed
E0114 16:02:30.789098   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1184: Successful get service frontend-5 {{(index .spec.ports 0).port}}: 80
(Bpod "valid-pod" deleted
E0114 16:02:30.893063   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "frontend" deleted
service "frontend-2" deleted
service "frontend-3" deleted
service "frontend-4" deleted
service "frontend-5" deleted
Successful
message:error: cannot expose a Node
has:cannot expose
Successful
message:The Service "invalid-large-service-name-that-has-more-than-sixty-three-characters" is invalid: metadata.name: Invalid value: "invalid-large-service-name-that-has-more-than-sixty-three-characters": must be no more than 63 characters
has:metadata.name: Invalid value
Successful
message:service/kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
has:kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
service "kubernetes-serve-hostname-testing-sixty-three-characters-in-len" deleted
Successful
message:service/etcd-server exposed
has:etcd-server exposed
core.sh:1214: Successful get service etcd-server {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: port-1 2380
E0114 16:02:31.601775   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1215: Successful get service etcd-server {{(index .spec.ports 1).name}} {{(index .spec.ports 1).port}}: port-2 2379
E0114 16:02:31.680566   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "etcd-server" deleted
E0114 16:02:31.790337   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1221: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
E0114 16:02:31.894278   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller "frontend" deleted
core.sh:1225: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:1229: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/frontend created
I0114 16:02:32.268835   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"frontend", UID:"dca03809-7c6b-429d-b387-3f2eceff7715", APIVersion:"v1", ResourceVersion:"1887", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-njtng
I0114 16:02:32.271639   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"frontend", UID:"dca03809-7c6b-429d-b387-3f2eceff7715", APIVersion:"v1", ResourceVersion:"1887", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-d82l4
I0114 16:02:32.274399   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"frontend", UID:"dca03809-7c6b-429d-b387-3f2eceff7715", APIVersion:"v1", ResourceVersion:"1887", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-9nqp8
replicationcontroller/redis-slave created
I0114 16:02:32.438644   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"redis-slave", UID:"6e9a64bf-d2c7-472a-937a-df37c4848b48", APIVersion:"v1", ResourceVersion:"1896", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-c82bk
I0114 16:02:32.443638   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"redis-slave", UID:"6e9a64bf-d2c7-472a-937a-df37c4848b48", APIVersion:"v1", ResourceVersion:"1896", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-dl28c
core.sh:1234: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
E0114 16:02:32.602837   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1238: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
E0114 16:02:32.681781   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller "frontend" deleted
replicationcontroller "redis-slave" deleted
E0114 16:02:32.791467   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1242: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:1246: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:02:32.895459   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend created
I0114 16:02:33.053804   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"frontend", UID:"5f2856b9-0f63-4a80-814a-169defcca0e1", APIVersion:"v1", ResourceVersion:"1915", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-hn9lv
I0114 16:02:33.056838   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"frontend", UID:"5f2856b9-0f63-4a80-814a-169defcca0e1", APIVersion:"v1", ResourceVersion:"1915", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-gdmbk
I0114 16:02:33.057464   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017744-24008", Name:"frontend", UID:"5f2856b9-0f63-4a80-814a-169defcca0e1", APIVersion:"v1", ResourceVersion:"1915", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-xkc6d
core.sh:1249: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
horizontalpodautoscaler.autoscaling/frontend autoscaled
core.sh:1252: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
horizontalpodautoscaler.autoscaling "frontend" deleted
horizontalpodautoscaler.autoscaling/frontend autoscaled
core.sh:1256: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
E0114 16:02:33.604100   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling "frontend" deleted
E0114 16:02:33.682940   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Error: required flag(s) "max" not set
replicationcontroller "frontend" deleted
E0114 16:02:33.792669   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:02:33.896723   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apiVersion: apps/v1
kind: Deployment
metadata:
  creationTimestamp: null
  labels:
    name: nginx-deployment-resources
... skipping 22 lines ...
          limits:
            cpu: 300m
          requests:
            cpu: 300m
      terminationGracePeriodSeconds: 0
status: {}
Error from server (NotFound): deployments.apps "nginx-deployment-resources" not found
deployment.apps/nginx-deployment-resources created
I0114 16:02:34.195769   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017744-24008", Name:"nginx-deployment-resources", UID:"3ee9e2ab-9ba9-472f-8596-fa147e3b020f", APIVersion:"apps/v1", ResourceVersion:"1937", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-67f8cfff5 to 3
I0114 16:02:34.199107   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017744-24008", Name:"nginx-deployment-resources-67f8cfff5", UID:"a81739dc-57fb-47f2-975d-9128c6aefd5f", APIVersion:"apps/v1", ResourceVersion:"1938", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-8q9pd
I0114 16:02:34.202544   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017744-24008", Name:"nginx-deployment-resources-67f8cfff5", UID:"a81739dc-57fb-47f2-975d-9128c6aefd5f", APIVersion:"apps/v1", ResourceVersion:"1938", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-hpmgl
I0114 16:02:34.202583   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017744-24008", Name:"nginx-deployment-resources-67f8cfff5", UID:"a81739dc-57fb-47f2-975d-9128c6aefd5f", APIVersion:"apps/v1", ResourceVersion:"1938", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-hhbv5
core.sh:1271: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment-resources:
core.sh:1272: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
core.sh:1273: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment-resources resource requirements updated
I0114 16:02:34.559970   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017744-24008", Name:"nginx-deployment-resources", UID:"3ee9e2ab-9ba9-472f-8596-fa147e3b020f", APIVersion:"apps/v1", ResourceVersion:"1951", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-55c547f795 to 1
I0114 16:02:34.564963   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017744-24008", Name:"nginx-deployment-resources-55c547f795", UID:"45886581-43c9-430f-8ea6-abdebbdb6a26", APIVersion:"apps/v1", ResourceVersion:"1952", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-55c547f795-5tvd5
E0114 16:02:34.605250   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1276: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 100m:
E0114 16:02:34.684148   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1277: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
E0114 16:02:34.793973   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: unable to find container named redis
E0114 16:02:34.897808   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment-resources resource requirements updated
I0114 16:02:34.910295   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017744-24008", Name:"nginx-deployment-resources", UID:"3ee9e2ab-9ba9-472f-8596-fa147e3b020f", APIVersion:"apps/v1", ResourceVersion:"1961", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-55c547f795 to 0
I0114 16:02:34.923756   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017744-24008", Name:"nginx-deployment-resources-55c547f795", UID:"45886581-43c9-430f-8ea6-abdebbdb6a26", APIVersion:"apps/v1", ResourceVersion:"1965", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-55c547f795-5tvd5
I0114 16:02:34.927671   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017744-24008", Name:"nginx-deployment-resources", UID:"3ee9e2ab-9ba9-472f-8596-fa147e3b020f", APIVersion:"apps/v1", ResourceVersion:"1967", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6d86564b45 to 1
I0114 16:02:34.932132   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017744-24008", Name:"nginx-deployment-resources-6d86564b45", UID:"6d598d05-cd9a-4b00-8247-b6f31f388951", APIVersion:"apps/v1", ResourceVersion:"1972", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6d86564b45-pzhzx
core.sh:1282: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
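The "resource requirements updated" lines, the 100m/200m assertions, and the "unable to find container named redis" error all fit kubectl set resources, which patches container resources in place and fails when the requested container does not exist. A sketch with names from this log (the redis line is the deliberate-failure case):

  # Update CPU limits for every container in the deployment.
  kubectl set resources deployment nginx-deployment-resources --limits=cpu=200m
  # Restricting to a container name that does not exist yields "unable to find container named redis".
  kubectl set resources deployment nginx-deployment-resources --containers=redis --limits=cpu=200m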
... skipping 80 lines ...
    status: "True"
    type: Progressing
  observedGeneration: 4
  replicas: 4
  unavailableReplicas: 4
  updatedReplicas: 1
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
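The --local error above is kubectl set resources refusing to run client-side without an input manifest; --local never contacts the server, so it needs -f. Correct usage per the hint, assuming an rsrc.yaml as in kubectl's own example:

  # Patch resources purely client-side and print the result; nothing is sent to the apiserver.
  kubectl set resources -f rsrc.yaml --local --limits=cpu=200m -o yaml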
E0114 16:02:35.606515   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1292: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
E0114 16:02:35.685244   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1293: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
E0114 16:02:35.795205   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1294: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
E0114 16:02:35.899002   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-deployment-resources" deleted
+++ exit code: 0
Recording: run_deployment_tests
Running command: run_deployment_tests

+++ Running case: test-cmd.run_deployment_tests 
... skipping 11 lines ...
message:10
has not:2
Successful
message:apps/v1
has:apps/v1
deployment.apps "test-nginx-extensions" deleted
E0114 16:02:36.607370   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/test-nginx-apps created
I0114 16:02:36.684133   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017756-14008", Name:"test-nginx-apps", UID:"ff910f55-7e12-4bdf-ab71-5a9c5a90da5a", APIVersion:"apps/v1", ResourceVersion:"2034", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-nginx-apps-79b9bd9585 to 1
E0114 16:02:36.686589   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 16:02:36.688170   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"test-nginx-apps-79b9bd9585", UID:"138190d5-c460-4bcd-8834-5140c11f89b6", APIVersion:"apps/v1", ResourceVersion:"2035", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-nginx-apps-79b9bd9585-wnxtv
apps.sh:198: Successful get deploy test-nginx-apps {{(index .spec.template.spec.containers 0).name}}: nginx
E0114 16:02:36.796331   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:10
has:10
E0114 16:02:36.900331   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:apps/v1
has:apps/v1
matched Name:
matched Pod Template:
matched Labels:
... skipping 10 lines ...
                pod-template-hash=79b9bd9585
Annotations:    deployment.kubernetes.io/desired-replicas: 1
                deployment.kubernetes.io/max-replicas: 2
                deployment.kubernetes.io/revision: 1
Controlled By:  Deployment/test-nginx-apps
Replicas:       1 current / 1 desired
Pods Status:    0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=test-nginx-apps
           pod-template-hash=79b9bd9585
  Containers:
   nginx:
    Image:        k8s.gcr.io/nginx:test-cmd
... skipping 39 lines ...
deployment.apps "test-nginx-apps" deleted
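The Replicas / Pods Status / Pod Template block above is kubectl describe output for the deployment's ReplicaSet (Controlled By: Deployment/test-nginx-apps). Roughly reproducible as:

  # Describe the ReplicaSet created for the deployment; the name suffix is the pod-template-hash.
  kubectl describe rs test-nginx-apps-79b9bd9585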
apps.sh:214: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx-with-command created
I0114 16:02:37.488979   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017756-14008", Name:"nginx-with-command", UID:"375ef858-1c54-493b-a85d-c3b3b1f3409a", APIVersion:"apps/v1", ResourceVersion:"2050", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-with-command-757c6f58dd to 1
I0114 16:02:37.491471   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-with-command-757c6f58dd", UID:"cb4d8cca-7098-45e8-99e6-ac414e0c31fe", APIVersion:"apps/v1", ResourceVersion:"2051", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-with-command-757c6f58dd-rkmjj
apps.sh:218: Successful get deploy nginx-with-command {{(index .spec.template.spec.containers 0).name}}: nginx
E0114 16:02:37.608871   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-with-command" deleted
E0114 16:02:37.687627   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:224: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:02:37.797679   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:37.901565   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/deployment-with-unixuserid created
I0114 16:02:37.937547   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017756-14008", Name:"deployment-with-unixuserid", UID:"074e9273-0823-4e7c-8a26-4b44f2f9d4e7", APIVersion:"apps/v1", ResourceVersion:"2064", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set deployment-with-unixuserid-8fcdfc94f to 1
I0114 16:02:37.942451   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"deployment-with-unixuserid-8fcdfc94f", UID:"2fa6406d-5368-4752-b18f-92816abdfd62", APIVersion:"apps/v1", ResourceVersion:"2065", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deployment-with-unixuserid-8fcdfc94f-st7fl
apps.sh:228: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: deployment-with-unixuserid:
deployment.apps "deployment-with-unixuserid" deleted
apps.sh:235: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx-deployment created
I0114 16:02:38.416124   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment", UID:"3b59cfa8-9b8e-4d8f-bf86-4821be998a88", APIVersion:"apps/v1", ResourceVersion:"2078", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I0114 16:02:38.419841   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment-6986c7bc94", UID:"9fbd7325-803c-413c-aceb-21268fbdf559", APIVersion:"apps/v1", ResourceVersion:"2079", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-vg56q
I0114 16:02:38.422431   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment-6986c7bc94", UID:"9fbd7325-803c-413c-aceb-21268fbdf559", APIVersion:"apps/v1", ResourceVersion:"2079", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-9nshp
I0114 16:02:38.425292   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment-6986c7bc94", UID:"9fbd7325-803c-413c-aceb-21268fbdf559", APIVersion:"apps/v1", ResourceVersion:"2079", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-jbd6n
apps.sh:239: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 3
E0114 16:02:38.610044   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-deployment" deleted
E0114 16:02:38.688963   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:242: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:02:38.798855   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:246: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:02:38.902658   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:247: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx-deployment created
I0114 16:02:38.995409   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment", UID:"dbbe2cff-410b-4c24-8fd5-b8ffd4d9305b", APIVersion:"apps/v1", ResourceVersion:"2100", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-7f6fc565b9 to 1
I0114 16:02:39.000139   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment-7f6fc565b9", UID:"16b19250-6278-45b0-ac97-f57e0173cc92", APIVersion:"apps/v1", ResourceVersion:"2101", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-7f6fc565b9-rptc9
apps.sh:251: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 1
deployment.apps "nginx-deployment" deleted
apps.sh:256: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:257: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 1
replicaset.apps "nginx-deployment-7f6fc565b9" deleted
E0114 16:02:39.611257   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:39.690226   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:02:39.800083   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I0114 16:02:39.860686   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment", UID:"e5a35cb8-2584-4dec-aa2c-b5d111dd1e3b", APIVersion:"apps/v1", ResourceVersion:"2121", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I0114 16:02:39.864061   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment-6986c7bc94", UID:"05c63beb-0e6c-4324-b285-2926c9573bd6", APIVersion:"apps/v1", ResourceVersion:"2122", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-chbrq
I0114 16:02:39.867476   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment-6986c7bc94", UID:"05c63beb-0e6c-4324-b285-2926c9573bd6", APIVersion:"apps/v1", ResourceVersion:"2122", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-7gkcx
I0114 16:02:39.867516   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment-6986c7bc94", UID:"05c63beb-0e6c-4324-b285-2926c9573bd6", APIVersion:"apps/v1", ResourceVersion:"2122", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-wmrzq
E0114 16:02:39.903904   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:268: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
horizontalpodautoscaler.autoscaling/nginx-deployment autoscaled
apps.sh:271: Successful get hpa nginx-deployment {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
horizontalpodautoscaler.autoscaling "nginx-deployment" deleted
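Deployments autoscale with the same flags as replication controllers; the 2 3 80 check at apps.sh:271 mirrors the earlier rc case. Sketch:

  # cpu-percent omitted again, so the default target of 80 applies.
  kubectl autoscale deployment nginx-deployment --min=2 --max=3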
deployment.apps "nginx-deployment" deleted
apps.sh:279: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx created
I0114 16:02:40.577406   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017756-14008", Name:"nginx", UID:"f9bef4ff-04e9-4a47-8c6b-d190e83ce039", APIVersion:"apps/v1", ResourceVersion:"2145", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-f87d999f7 to 3
I0114 16:02:40.580532   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-f87d999f7", UID:"eed312d6-c3ff-46e4-9111-11a7481bbbb8", APIVersion:"apps/v1", ResourceVersion:"2146", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-zl9tk
I0114 16:02:40.585045   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-f87d999f7", UID:"eed312d6-c3ff-46e4-9111-11a7481bbbb8", APIVersion:"apps/v1", ResourceVersion:"2146", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-9tnm9
I0114 16:02:40.585679   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-f87d999f7", UID:"eed312d6-c3ff-46e4-9111-11a7481bbbb8", APIVersion:"apps/v1", ResourceVersion:"2146", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-mmr5x
E0114 16:02:40.612308   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:283: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx:
E0114 16:02:40.691442   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:284: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0114 16:02:40.801403   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx skipped rollback (current template already matches revision 1)
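The "skipped rollback" line is kubectl rollout undo noticing that the requested revision's pod template is identical to what is already running. Sketch with this log's deployment:

  # Roll back to revision 1 explicitly; a no-op when the live template already matches it.
  kubectl rollout undo deployment nginx --to-revision=1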
E0114 16:02:40.905218   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:287: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
deployment.apps/nginx configured
I0114 16:02:41.115555   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017756-14008", Name:"nginx", UID:"f9bef4ff-04e9-4a47-8c6b-d190e83ce039", APIVersion:"apps/v1", ResourceVersion:"2159", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-78487f9fd7 to 1
I0114 16:02:41.118432   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-78487f9fd7", UID:"d069b434-3d26-474e-b9ec-c4b836dd755c", APIVersion:"apps/v1", ResourceVersion:"2160", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-78487f9fd7-vzx72
apps.sh:290: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
    Image:	k8s.gcr.io/nginx:test-cmd
apps.sh:293: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
deployment.apps/nginx rolled back
E0114 16:02:41.613421   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:41.692679   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:41.802545   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:41.906391   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:297: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0114 16:02:42.614528   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: unable to find specified revision 1000000 in history
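Asking for a revision that was never recorded fails up front, as above. The recorded revisions can be listed before picking one:

  # Show the rollout history for the deployment.
  kubectl rollout history deployment nginx
  # A nonexistent revision reproduces the "unable to find specified revision" error.
  kubectl rollout undo deployment nginx --to-revision=1000000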
E0114 16:02:42.694028   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:300: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0114 16:02:42.803898   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx rolled back
E0114 16:02:42.907507   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:43.615691   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:43.695208   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:43.805064   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:43.908644   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:304: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
deployment.apps/nginx paused
error: you cannot rollback a paused deployment; resume it first with 'kubectl rollout resume deployment/nginx' and try again
error: deployments.apps "nginx" can't restart paused deployment (run rollout resume first)
deployment.apps/nginx resumed
deployment.apps/nginx rolled back
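The paused/resumed exchange above shows that a paused deployment rejects both rollback and restart until it is resumed. The corresponding commands:

  kubectl rollout pause deployment nginx
  # Both of these fail while paused, with the errors shown above.
  kubectl rollout undo deployment nginx
  kubectl rollout restart deployment nginx
  # Resuming re-enables rollouts; the undo then goes through.
  kubectl rollout resume deployment nginx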
E0114 16:02:44.617003   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
    deployment.kubernetes.io/revision-history: 1,3
E0114 16:02:44.696305   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: desired revision (3) is different from the running revision (5)
E0114 16:02:44.806426   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx restarted
I0114 16:02:44.896396   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017756-14008", Name:"nginx", UID:"f9bef4ff-04e9-4a47-8c6b-d190e83ce039", APIVersion:"apps/v1", ResourceVersion:"2190", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-f87d999f7 to 2
I0114 16:02:44.902629   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-f87d999f7", UID:"eed312d6-c3ff-46e4-9111-11a7481bbbb8", APIVersion:"apps/v1", ResourceVersion:"2194", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-f87d999f7-mmr5x
I0114 16:02:44.905576   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017756-14008", Name:"nginx", UID:"f9bef4ff-04e9-4a47-8c6b-d190e83ce039", APIVersion:"apps/v1", ResourceVersion:"2193", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-fccbf69d5 to 1
E0114 16:02:44.909448   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 16:02:44.911472   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-fccbf69d5", UID:"6abea99a-0886-4209-a936-d920ab62ec4a", APIVersion:"apps/v1", ResourceVersion:"2198", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-fccbf69d5-4tm52
E0114 16:02:45.618148   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:45.697546   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:45.807638   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:45.910790   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:apiVersion: apps/v1
kind: ReplicaSet
metadata:
  annotations:
    deployment.kubernetes.io/desired-replicas: "3"
... skipping 56 lines ...
I0114 16:02:46.229723   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx2-57b7865cd9", UID:"df51fe9d-3573-4de5-a411-b794657d918a", APIVersion:"apps/v1", ResourceVersion:"2213", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-8r7v2
I0114 16:02:46.234030   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx2-57b7865cd9", UID:"df51fe9d-3573-4de5-a411-b794657d918a", APIVersion:"apps/v1", ResourceVersion:"2213", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-6x897
I0114 16:02:46.234650   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx2-57b7865cd9", UID:"df51fe9d-3573-4de5-a411-b794657d918a", APIVersion:"apps/v1", ResourceVersion:"2213", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-l558w
deployment.apps "nginx2" deleted
deployment.apps "nginx" deleted
apps.sh:334: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:02:46.619483   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I0114 16:02:46.696581   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment", UID:"81cbad90-8f83-45fb-8f3d-4f8aff3f7bb3", APIVersion:"apps/v1", ResourceVersion:"2246", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-598d4d68b4 to 3
E0114 16:02:46.698627   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 16:02:46.703645   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment-598d4d68b4", UID:"652cd862-e5cd-4ac1-b8c3-d6ff80793d86", APIVersion:"apps/v1", ResourceVersion:"2247", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-7ll47
I0114 16:02:46.708293   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment-598d4d68b4", UID:"652cd862-e5cd-4ac1-b8c3-d6ff80793d86", APIVersion:"apps/v1", ResourceVersion:"2247", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-xvvpv
I0114 16:02:46.708353   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment-598d4d68b4", UID:"652cd862-e5cd-4ac1-b8c3-d6ff80793d86", APIVersion:"apps/v1", ResourceVersion:"2247", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-dsrgb
apps.sh:337: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
E0114 16:02:46.808782   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:338: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0114 16:02:46.912059   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:339: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
(Bdeployment.apps/nginx-deployment image updated
I0114 16:02:47.092264   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment", UID:"81cbad90-8f83-45fb-8f3d-4f8aff3f7bb3", APIVersion:"apps/v1", ResourceVersion:"2260", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-59df9b5f5b to 1
I0114 16:02:47.097939   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment-59df9b5f5b", UID:"30526215-2682-44f0-a68a-05eb08aeb31b", APIVersion:"apps/v1", ResourceVersion:"2261", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-59df9b5f5b-zkf8k
apps.sh:342: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
(Bapps.sh:343: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
error: unable to find container named "redis"
deployment.apps/nginx-deployment image updated
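The image flips between nginx:test-cmd and nginx:1.7.9 above, plus the quoted "unable to find container named \"redis\"" failure, are the behavior of kubectl set image: containers are addressed by name, and unknown names are rejected. Sketch (the redis image value below is only illustrative):

  # Update the container named "nginx" to a new tag.
  kubectl set image deployment nginx-deployment nginx=k8s.gcr.io/nginx:1.7.9
  # An unknown container name fails: unable to find container named "redis".
  kubectl set image deployment nginx-deployment redis=k8s.gcr.io/redis:1.0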
apps.sh:348: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0114 16:02:47.620675   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:349: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
E0114 16:02:47.699785   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment image updated
E0114 16:02:47.809889   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:352: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
E0114 16:02:47.914140   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:353: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
apps.sh:356: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
I0114 16:02:48.223916   54489 horizontal.go:353] Horizontal Pod Autoscaler frontend has been deleted in namespace-1579017744-24008
apps.sh:357: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment image updated
I0114 16:02:48.364870   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment", UID:"81cbad90-8f83-45fb-8f3d-4f8aff3f7bb3", APIVersion:"apps/v1", ResourceVersion:"2280", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 2
I0114 16:02:48.372117   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment-598d4d68b4", UID:"652cd862-e5cd-4ac1-b8c3-d6ff80793d86", APIVersion:"apps/v1", ResourceVersion:"2284", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-7ll47
I0114 16:02:48.374103   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment", UID:"81cbad90-8f83-45fb-8f3d-4f8aff3f7bb3", APIVersion:"apps/v1", ResourceVersion:"2283", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-7d758dbc54 to 1
I0114 16:02:48.378642   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment-7d758dbc54", UID:"73b731c7-c9b6-49ce-8389-d77feb67527c", APIVersion:"apps/v1", ResourceVersion:"2288", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-7d758dbc54-x62ww
apps.sh:360: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:361: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0114 16:02:48.621848   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:48.700972   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:364: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0114 16:02:48.811066   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:365: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0114 16:02:48.915147   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-deployment" deleted
apps.sh:371: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx-deployment created
I0114 16:02:49.197541   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment", UID:"b06fcaec-7fa5-4c3a-8f9c-33eef0f71fe1", APIVersion:"apps/v1", ResourceVersion:"2312", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-598d4d68b4 to 3
I0114 16:02:49.203901   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment-598d4d68b4", UID:"c442a444-43d7-4a76-b4b2-6dc27e0a15bb", APIVersion:"apps/v1", ResourceVersion:"2313", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-lnqr6
I0114 16:02:49.209824   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment-598d4d68b4", UID:"c442a444-43d7-4a76-b4b2-6dc27e0a15bb", APIVersion:"apps/v1", ResourceVersion:"2313", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-5v452
I0114 16:02:49.213882   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment-598d4d68b4", UID:"c442a444-43d7-4a76-b4b2-6dc27e0a15bb", APIVersion:"apps/v1", ResourceVersion:"2313", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-r2rbr
configmap/test-set-env-config created
secret/test-set-env-secret created
E0114 16:02:49.623034   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:376: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
E0114 16:02:49.702237   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:378: Successful get configmaps/test-set-env-config {{.metadata.name}}: test-set-env-config
E0114 16:02:49.812196   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:379: Successful get secret {{range.items}}{{.metadata.name}}:{{end}}: test-set-env-secret:
E0114 16:02:49.916188   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment env updated
I0114 16:02:49.933947   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment", UID:"b06fcaec-7fa5-4c3a-8f9c-33eef0f71fe1", APIVersion:"apps/v1", ResourceVersion:"2331", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6b9f7756b4 to 1
I0114 16:02:49.937824   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment-6b9f7756b4", UID:"f37a29f7-b4b9-46e3-9c99-f8ec6638f851", APIVersion:"apps/v1", ResourceVersion:"2332", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6b9f7756b4-ng2wk
apps.sh:383: Successful get deploy nginx-deployment {{ (index (index .spec.template.spec.containers 0).env 0).name}}: KEY_2
apps.sh:385: Successful get deploy nginx-deployment {{ len (index .spec.template.spec.containers 0).env }}: 1
deployment.apps/nginx-deployment env updated
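The checks at apps.sh:383-385 exercise kubectl set env, which edits the container env list in place; the ConfigMap and Secret created above can also be imported wholesale. A sketch of the flag shapes involved (values illustrative):

    kubectl set env deployment/nginx-deployment KEY_2=value
    # import every key from an existing ConfigMap or Secret
    kubectl set env deployment/nginx-deployment --from=configmap/test-set-env-config
    kubectl set env deployment/nginx-deployment --from=secret/test-set-env-secret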
... skipping 10 lines ...
deployment.apps/nginx-deployment env updated
I0114 16:02:50.524363   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment", UID:"b06fcaec-7fa5-4c3a-8f9c-33eef0f71fe1", APIVersion:"apps/v1", ResourceVersion:"2382", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 0
I0114 16:02:50.531668   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment-598d4d68b4", UID:"c442a444-43d7-4a76-b4b2-6dc27e0a15bb", APIVersion:"apps/v1", ResourceVersion:"2386", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-r2rbr
I0114 16:02:50.533729   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment", UID:"b06fcaec-7fa5-4c3a-8f9c-33eef0f71fe1", APIVersion:"apps/v1", ResourceVersion:"2385", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-5958f7687 to 1
I0114 16:02:50.564861   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment-5958f7687", UID:"01fa80b0-19ec-4cdf-af69-d9e1539edf06", APIVersion:"apps/v1", ResourceVersion:"2390", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-5958f7687-lsws6
deployment.apps/nginx-deployment env updated
E0114 16:02:50.623919   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 16:02:50.632714   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment", UID:"b06fcaec-7fa5-4c3a-8f9c-33eef0f71fe1", APIVersion:"apps/v1", ResourceVersion:"2397", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-6b9f7756b4 to 0
E0114 16:02:50.703288   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment env updated
I0114 16:02:50.733744   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment", UID:"b06fcaec-7fa5-4c3a-8f9c-33eef0f71fe1", APIVersion:"apps/v1", ResourceVersion:"2399", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-98b7fd455 to 1
deployment.apps/nginx-deployment env updated
E0114 16:02:50.813116   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 16:02:50.868971   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017756-14008", Name:"nginx-deployment-6b9f7756b4", UID:"f37a29f7-b4b9-46e3-9c99-f8ec6638f851", APIVersion:"apps/v1", ResourceVersion:"2400", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-6b9f7756b4-ng2wk
deployment.apps "nginx-deployment" deleted
E0114 16:02:50.917185   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:50.964157   54489 replica_set.go:534] sync "namespace-1579017756-14008/nginx-deployment-598d4d68b4" failed with Operation cannot be fulfilled on replicasets.apps "nginx-deployment-598d4d68b4": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1579017756-14008/nginx-deployment-598d4d68b4, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: c442a444-43d7-4a76-b4b2-6dc27e0a15bb, UID in object meta: 
configmap "test-set-env-config" deleted
E0114 16:02:51.014007   54489 replica_set.go:534] sync "namespace-1579017756-14008/nginx-deployment-98b7fd455" failed with replicasets.apps "nginx-deployment-98b7fd455" not found
secret "test-set-env-secret" deleted
+++ exit code: 0
Recording: run_rs_tests
Running command: run_rs_tests

+++ Running case: test-cmd.run_rs_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_rs_tests
+++ [0114 16:02:51] Creating namespace namespace-1579017771-8799
E0114 16:02:51.214028   54489 replica_set.go:534] sync "namespace-1579017756-14008/nginx-deployment-5958f7687" failed with replicasets.apps "nginx-deployment-5958f7687" not found
namespace/namespace-1579017771-8799 created
E0114 16:02:51.264097   54489 replica_set.go:534] sync "namespace-1579017756-14008/nginx-deployment-6b9f7756b4" failed with replicasets.apps "nginx-deployment-6b9f7756b4" not found
Context "test" modified.
+++ [0114 16:02:51] Testing kubectl(v1:replicasets)
E0114 16:02:51.313584   54489 replica_set.go:534] sync "namespace-1579017756-14008/nginx-deployment-d74969475" failed with replicasets.apps "nginx-deployment-d74969475" not found
apps.sh:511: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
replicaset.apps/frontend created
I0114 16:02:51.553119   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"frontend", UID:"bdec2bd0-0dd7-48df-ac8e-786d28062d46", APIVersion:"apps/v1", ResourceVersion:"2432", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-mqjlw
I0114 16:02:51.556487   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"frontend", UID:"bdec2bd0-0dd7-48df-ac8e-786d28062d46", APIVersion:"apps/v1", ResourceVersion:"2432", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-ml2fv
I0114 16:02:51.556935   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"frontend", UID:"bdec2bd0-0dd7-48df-ac8e-786d28062d46", APIVersion:"apps/v1", ResourceVersion:"2432", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-gpjhb
+++ [0114 16:02:51] Deleting rs
E0114 16:02:51.624974   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps "frontend" deleted
E0114 16:02:51.704443   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:517: Successful get pods -l "tier=frontend" {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:02:51.764122   54489 replica_set.go:534] sync "namespace-1579017771-8799/frontend" failed with replicasets.apps "frontend" not found
E0114 16:02:51.814352   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:521: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:02:51.918370   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I0114 16:02:51.989135   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"frontend", UID:"dbaedb36-6fea-4c5c-bea2-6e47a112bcd8", APIVersion:"apps/v1", ResourceVersion:"2447", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-q565k
I0114 16:02:51.991778   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"frontend", UID:"dbaedb36-6fea-4c5c-bea2-6e47a112bcd8", APIVersion:"apps/v1", ResourceVersion:"2447", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-dw5rq
I0114 16:02:51.994755   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"frontend", UID:"dbaedb36-6fea-4c5c-bea2-6e47a112bcd8", APIVersion:"apps/v1", ResourceVersion:"2447", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-f8zz9
apps.sh:525: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
+++ [0114 16:02:52] Deleting rs
replicaset.apps "frontend" deleted
E0114 16:02:52.215040   54489 replica_set.go:534] sync "namespace-1579017771-8799/frontend" failed with replicasets.apps "frontend" not found
apps.sh:529: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:531: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
pod "frontend-dw5rq" deleted
pod "frontend-f8zz9" deleted
pod "frontend-q565k" deleted
apps.sh:534: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:02:52.626064   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:538: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:02:52.705693   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I0114 16:02:52.807496   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"frontend", UID:"052093e8-5e90-4ad3-b336-a23cb6a60537", APIVersion:"apps/v1", ResourceVersion:"2466", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-5m7kk
I0114 16:02:52.809704   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"frontend", UID:"052093e8-5e90-4ad3-b336-a23cb6a60537", APIVersion:"apps/v1", ResourceVersion:"2466", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-pnw9h
I0114 16:02:52.809774   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"frontend", UID:"052093e8-5e90-4ad3-b336-a23cb6a60537", APIVersion:"apps/v1", ResourceVersion:"2466", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-cxmnv
E0114 16:02:52.815176   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:542: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
E0114 16:02:52.919634   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Pod Template:
matched Labels:
matched Selector:
matched Replicas:
matched Pods Status:
... skipping 3 lines ...
Namespace:    namespace-1579017771-8799
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1579017771-8799
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 18 lines ...
Namespace:    namespace-1579017771-8799
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 12 lines ...
Namespace:    namespace-1579017771-8799
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 25 lines ...
Namespace:    namespace-1579017771-8799
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1579017771-8799
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                   Message
  ----    ------            ----  ----                   -------
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-5m7kk
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-pnw9h
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-cxmnv
E0114 16:02:53.627222   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1579017771-8799
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 3 lines ...
      cpu:     100m
      memory:  100Mi
    Environment:
      GET_HOSTS_FROM:  dns
    Mounts:            <none>
  Volumes:             <none>
E0114 16:02:53.706830   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1579017771-8799
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                   Message
  ----    ------            ----  ----                   -------
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-5m7kk
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-pnw9h
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-cxmnv
E0114 16:02:53.816584   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Image:
matched Node:
matched Labels:
matched Status:
matched Controlled By
... skipping 80 lines ...
    Mounts:            <none>
Volumes:               <none>
QoS Class:             Burstable
Node-Selectors:        <none>
Tolerations:           <none>
Events:                <none>
E0114 16:02:53.920794   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:564: Successful get rs frontend {{.spec.replicas}}: 3
replicaset.apps/frontend scaled
E0114 16:02:54.056134   54489 replica_set.go:199] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1579017771-8799 /apis/apps/v1/namespaces/namespace-1579017771-8799/replicasets/frontend 052093e8-5e90-4ad3-b336-a23cb6a60537 2477 2 2020-01-14 16:02:52 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v3 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc00356e338 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] []}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:1,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
I0114 16:02:54.061052   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"frontend", UID:"052093e8-5e90-4ad3-b336-a23cb6a60537", APIVersion:"apps/v1", ResourceVersion:"2477", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-5m7kk
apps.sh:568: Successful get rs frontend {{.spec.replicas}}: 2
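The pair of checks at apps.sh:564/568 brackets a scale-down of the ReplicaSet from 3 to 2 replicas and re-reads .spec.replicas. A sketch of the equivalent commands:

    kubectl scale rs/frontend --replicas=2
    kubectl get rs frontend -o go-template='{{.spec.replicas}}'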
deployment.apps/scale-1 created
I0114 16:02:54.314592   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017771-8799", Name:"scale-1", UID:"96773036-b904-45f8-9da3-e31434877f05", APIVersion:"apps/v1", ResourceVersion:"2483", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 1
I0114 16:02:54.318504   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"scale-1-5c5565bcd9", UID:"2d888ec2-b30d-4ad6-afe3-363b77cb8854", APIVersion:"apps/v1", ResourceVersion:"2484", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-5c5565bcd9-drwnf
deployment.apps/scale-2 created
I0114 16:02:54.492014   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017771-8799", Name:"scale-2", UID:"81f19a1d-7e6f-44ad-be30-d7eb726864d6", APIVersion:"apps/v1", ResourceVersion:"2493", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-5c5565bcd9 to 1
I0114 16:02:54.495465   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"scale-2-5c5565bcd9", UID:"4632a8ae-b969-4f75-b26c-97a3a1ef16f9", APIVersion:"apps/v1", ResourceVersion:"2494", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-5c5565bcd9-wmf8r
E0114 16:02:54.628633   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/scale-3 created
I0114 16:02:54.671503   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017771-8799", Name:"scale-3", UID:"c83a7412-6c2b-4dff-a1d1-b71e1f312dcb", APIVersion:"apps/v1", ResourceVersion:"2502", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-3-5c5565bcd9 to 1
I0114 16:02:54.675282   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"scale-3-5c5565bcd9", UID:"3d613875-f283-45d3-8403-d3e6a3bd42e6", APIVersion:"apps/v1", ResourceVersion:"2503", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-27bck
E0114 16:02:54.707765   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:574: Successful get deploy scale-1 {{.spec.replicas}}: 1
E0114 16:02:54.817923   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:575: Successful get deploy scale-2 {{.spec.replicas}}: 1
E0114 16:02:54.922020   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:576: Successful get deploy scale-3 {{.spec.replicas}}: 1
deployment.apps/scale-1 scaled
I0114 16:02:55.035090   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017771-8799", Name:"scale-1", UID:"96773036-b904-45f8-9da3-e31434877f05", APIVersion:"apps/v1", ResourceVersion:"2512", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 2
deployment.apps/scale-2 scaled
I0114 16:02:55.036632   54489 horizontal.go:353] Horizontal Pod Autoscaler nginx-deployment has been deleted in namespace-1579017756-14008
I0114 16:02:55.038179   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"scale-1-5c5565bcd9", UID:"2d888ec2-b30d-4ad6-afe3-363b77cb8854", APIVersion:"apps/v1", ResourceVersion:"2513", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-5c5565bcd9-j8fdj
... skipping 11 lines ...
I0114 16:02:55.400343   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"scale-2-5c5565bcd9", UID:"4632a8ae-b969-4f75-b26c-97a3a1ef16f9", APIVersion:"apps/v1", ResourceVersion:"2540", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-5c5565bcd9-2bj5p
I0114 16:02:55.401765   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017771-8799", Name:"scale-3", UID:"c83a7412-6c2b-4dff-a1d1-b71e1f312dcb", APIVersion:"apps/v1", ResourceVersion:"2543", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-3-5c5565bcd9 to 3
I0114 16:02:55.405401   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"scale-3-5c5565bcd9", UID:"3d613875-f283-45d3-8403-d3e6a3bd42e6", APIVersion:"apps/v1", ResourceVersion:"2547", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-z2lxt
I0114 16:02:55.409533   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"scale-3-5c5565bcd9", UID:"3d613875-f283-45d3-8403-d3e6a3bd42e6", APIVersion:"apps/v1", ResourceVersion:"2547", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-vmg8w
apps.sh:584: Successful get deploy scale-1 {{.spec.replicas}}: 3
apps.sh:585: Successful get deploy scale-2 {{.spec.replicas}}: 3
E0114 16:02:55.629900   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:586: Successful get deploy scale-3 {{.spec.replicas}}: 3
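All three deployments land on 3 replicas; kubectl scale accepts several resources in a single invocation, which is presumably how the harness drives this step. A sketch:

    # scale three deployments in one command
    kubectl scale deployment scale-1 scale-2 scale-3 --replicas=3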
E0114 16:02:55.709103   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps "frontend" deleted
E0114 16:02:55.818989   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "scale-1" deleted
deployment.apps "scale-2" deleted
deployment.apps "scale-3" deleted
E0114 16:02:55.923116   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:55.963973   54489 replica_set.go:534] sync "namespace-1579017771-8799/scale-3-5c5565bcd9" failed with replicasets.apps "scale-3-5c5565bcd9" not found
replicaset.apps/frontend created
I0114 16:02:56.040989   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"frontend", UID:"45ecfb2d-d156-47b2-af34-7bf931299a81", APIVersion:"apps/v1", ResourceVersion:"2595", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-tf4nw
I0114 16:02:56.066044   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"frontend", UID:"45ecfb2d-d156-47b2-af34-7bf931299a81", APIVersion:"apps/v1", ResourceVersion:"2595", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-9vjd2
I0114 16:02:56.116014   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"frontend", UID:"45ecfb2d-d156-47b2-af34-7bf931299a81", APIVersion:"apps/v1", ResourceVersion:"2595", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-lhqfn
apps.sh:594: Successful get rs frontend {{.spec.replicas}}: 3
service/frontend exposed
apps.sh:598: Successful get service frontend {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
service/frontend-2 exposed
apps.sh:602: Successful get service frontend-2 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: default 80
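The two expose checks read the first port of each generated Service; the <no value> at apps.sh:598 just means that port was created without a name. A sketch against the same ReplicaSet:

    kubectl expose rs frontend --port=80
    kubectl get service frontend -o go-template='{{(index .spec.ports 0).port}}'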
(Bservice "frontend" deleted
service "frontend-2" deleted
E0114 16:02:56.630998   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:56.710376   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:608: Successful get rs frontend {{.metadata.generation}}: 1
replicaset.apps/frontend image updated
E0114 16:02:56.820396   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:610: Successful get rs frontend {{.metadata.generation}}: 2
E0114 16:02:56.924305   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend env updated
apps.sh:612: Successful get rs frontend {{.metadata.generation}}: 3
replicaset.apps/frontend resource requirements updated
apps.sh:614: Successful get rs frontend {{.metadata.generation}}: 4
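apps.sh:608-614 confirm that each mutation of the ReplicaSet's pod template bumps .metadata.generation by one. A sketch of the kind of mutations involved (image tag and resource values illustrative):

    kubectl set image rs/frontend php-redis=gcr.io/google_samples/gb-frontend:v4  # generation 1 -> 2
    kubectl set env rs/frontend FOO=bar                                           # 2 -> 3
    kubectl set resources rs/frontend --limits=cpu=200m,memory=512Mi              # 3 -> 4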
apps.sh:618: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
replicaset.apps "frontend" deleted
apps.sh:622: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:02:57.632270   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:626: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:02:57.711683   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:02:57.821808   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I0114 16:02:57.859442   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"frontend", UID:"ead533ed-d678-4dcc-a5a3-bc192d6de99c", APIVersion:"apps/v1", ResourceVersion:"2631", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-rqz49
I0114 16:02:57.863257   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"frontend", UID:"ead533ed-d678-4dcc-a5a3-bc192d6de99c", APIVersion:"apps/v1", ResourceVersion:"2631", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-jnl8x
I0114 16:02:57.864134   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"frontend", UID:"ead533ed-d678-4dcc-a5a3-bc192d6de99c", APIVersion:"apps/v1", ResourceVersion:"2631", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-pvwch
E0114 16:02:57.925611   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/redis-slave created
I0114 16:02:58.029689   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"redis-slave", UID:"bb364285-f8ce-42d0-ba36-f1990f407c66", APIVersion:"apps/v1", ResourceVersion:"2640", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-sxwtg
I0114 16:02:58.034675   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"redis-slave", UID:"bb364285-f8ce-42d0-ba36-f1990f407c66", APIVersion:"apps/v1", ResourceVersion:"2640", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-wlr9s
apps.sh:631: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
apps.sh:635: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
replicaset.apps "frontend" deleted
replicaset.apps "redis-slave" deleted
apps.sh:639: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:644: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:02:58.633445   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I0114 16:02:58.695062   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"frontend", UID:"89618e37-1258-47f6-926d-1efd1b16468a", APIVersion:"apps/v1", ResourceVersion:"2660", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-78j9f
I0114 16:02:58.697971   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"frontend", UID:"89618e37-1258-47f6-926d-1efd1b16468a", APIVersion:"apps/v1", ResourceVersion:"2660", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-4lprs
I0114 16:02:58.699841   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017771-8799", Name:"frontend", UID:"89618e37-1258-47f6-926d-1efd1b16468a", APIVersion:"apps/v1", ResourceVersion:"2660", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-stkql
E0114 16:02:58.712517   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:647: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
E0114 16:02:58.823134   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling/frontend autoscaled
E0114 16:02:58.926865   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:650: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
(Bhorizontalpodautoscaler.autoscaling "frontend" deleted
horizontalpodautoscaler.autoscaling/frontend autoscaled
apps.sh:654: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
(Bhorizontalpodautoscaler.autoscaling "frontend" deleted
Error: required flag(s) "max" not set
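That last error is the negative case for kubectl autoscale: --max is a required flag. A sketch of both outcomes:

    kubectl autoscale rs frontend --min=2 --max=3 --cpu-percent=80  # creates the HPA checked at apps.sh:654
    kubectl autoscale rs frontend --min=2 --cpu-percent=80          # rejected: required flag(s) "max" not set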
replicaset.apps "frontend" deleted
+++ exit code: 0
Recording: run_stateful_set_tests
Running command: run_stateful_set_tests

+++ Running case: test-cmd.run_stateful_set_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_stateful_set_tests
+++ [0114 16:02:59] Creating namespace namespace-1579017779-32118
E0114 16:02:59.634489   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1579017779-32118 created
E0114 16:02:59.713847   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0114 16:02:59] Testing kubectl(v1:statefulsets)
E0114 16:02:59.824366   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:470: Successful get statefulset {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:02:59.928169   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 16:03:00.019907   51025 controller.go:606] quota admission added evaluator for: statefulsets.apps
statefulset.apps/nginx created
apps.sh:476: Successful get statefulset nginx {{.spec.replicas}}: 0
apps.sh:477: Successful get statefulset nginx {{.status.observedGeneration}}: 1
statefulset.apps/nginx scaled
I0114 16:03:00.293300   54489 event.go:278] Event(v1.ObjectReference{Kind:"StatefulSet", Namespace:"namespace-1579017779-32118", Name:"nginx", UID:"eb56219a-1b2d-42a8-b344-e895e7bdee80", APIVersion:"apps/v1", ResourceVersion:"2688", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' create Pod nginx-0 in StatefulSet nginx successful
apps.sh:481: Successful get statefulset nginx {{.spec.replicas}}: 1
apps.sh:482: Successful get statefulset nginx {{.status.observedGeneration}}: 2
E0114 16:03:00.635595   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx restarted
E0114 16:03:00.715062   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:490: Successful get statefulset nginx {{.status.observedGeneration}}: 3
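observedGeneration climbs from 1 to 3 here because both the scale-up and the restart edit the StatefulSet spec, and the controller reports each new generation back in status. A sketch of the two mutations:

    kubectl scale statefulset nginx --replicas=1
    kubectl rollout restart statefulset nginx  # the "restarted" line above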
(Bstatefulset.apps "nginx" deleted
I0114 16:03:00.801232   54489 stateful_set.go:420] StatefulSet has been deleted namespace-1579017779-32118/nginx
E0114 16:03:00.825452   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
E0114 16:03:00.929393   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_statefulset_history_tests
Running command: run_statefulset_history_tests

+++ Running case: test-cmd.run_statefulset_history_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_statefulset_history_tests
... skipping 3 lines ...
+++ [0114 16:03:01] Testing kubectl(v1:statefulsets, v1:controllerrevisions)
apps.sh:418: Successful get statefulset {{range.items}}{{.metadata.name}}:{{end}}: 
statefulset.apps/nginx created
apps.sh:422: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1579017780-2623"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.7","name":"nginx","ports":[{"containerPort":80,"name":"web"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
statefulset.apps/nginx skipped rollback (current template already matches revision 1)
E0114 16:03:01.636805   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:425: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
E0114 16:03:01.716373   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:426: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
E0114 16:03:01.826617   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx configured
E0114 16:03:01.930506   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:429: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
apps.sh:430: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:431: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
apps.sh:432: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1579017780-2623"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.7","name":"nginx","ports":[{"containerPort":80,"name":"web"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1579017780-2623"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.8","name":"nginx","ports":[{"containerPort":80,"name":"web"}]},{"image":"k8s.gcr.io/pause:2.0","name":"pause","ports":[{"containerPort":81,"name":"web-2"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
... skipping 11 lines ...
    Environment:	<none>
    Mounts:	<none>
  Volumes:	<none>
 (dry run)
apps.sh:435: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
apps.sh:436: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
E0114 16:03:02.638149   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:437: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
E0114 16:03:02.717583   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx rolled back
E0114 16:03:02.827745   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:440: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
E0114 16:03:02.931636   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:441: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
Successful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
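apps.sh:440-451 walk the ControllerRevision history in both directions: undo to revision 1, reject a revision that was never recorded, then roll forward again. A sketch:

    kubectl rollout undo statefulset nginx --to-revision=1
    kubectl rollout undo statefulset nginx --to-revision=1000000  # fails: revision not found in history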
apps.sh:445: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
apps.sh:446: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
statefulset.apps/nginx rolled back
apps.sh:449: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
apps.sh:450: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:451: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
E0114 16:03:03.639336   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps "nginx" deleted
I0114 16:03:03.705413   54489 stateful_set.go:420] StatefulSet has been deleted namespace-1579017780-2623/nginx
E0114 16:03:03.718551   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
E0114 16:03:03.829002   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_lists_tests
Running command: run_lists_tests

+++ Running case: test-cmd.run_lists_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_lists_tests
+++ [0114 16:03:03] Creating namespace namespace-1579017783-5184
E0114 16:03:03.932794   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1579017783-5184 created
Context "test" modified.
+++ [0114 16:03:04] Testing kubectl(v1:lists)
service/list-service-test created
deployment.apps/list-deployment-test created
I0114 16:03:04.204413   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017783-5184", Name:"list-deployment-test", UID:"76a0e938-3ab0-4a23-82ca-2344be83196c", APIVersion:"apps/v1", ResourceVersion:"2726", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set list-deployment-test-7cd8c5ff6d to 1
... skipping 10 lines ...
+++ [0114 16:03:04] Creating namespace namespace-1579017784-32724
namespace/namespace-1579017784-32724 created
Context "test" modified.
+++ [0114 16:03:04] Testing kubectl(v1:multiple resources)
Testing with file hack/testdata/multi-resource-yaml.yaml and replace with file hack/testdata/multi-resource-yaml-modify.yaml
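Each pass of this test creates a Service and a ReplicationController from one multi-document file, then swaps both for the modified variant, so a single create/replace touches the whole pair. A sketch using the files named above:

    kubectl create -f hack/testdata/multi-resource-yaml.yaml
    kubectl replace -f hack/testdata/multi-resource-yaml-modify.yaml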
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:03:04.640637   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:03:04.719644   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:03:04.830346   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
replicationcontroller/mock created
I0114 16:03:04.869627   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017784-32724", Name:"mock", UID:"19b4fac3-ac72-48e3-9401-10df774a6a89", APIVersion:"v1", ResourceVersion:"2748", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-5lb27
E0114 16:03:04.933983   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
service/mock   ClusterIP   10.0.0.169   <none>        99/TCP    1s

NAME                         DESIRED   CURRENT   READY   AGE
... skipping 15 lines ...
Name:         mock
Namespace:    namespace-1579017784-32724
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 8 lines ...
service "mock" deleted
replicationcontroller "mock" deleted
service/mock replaced
replicationcontroller/mock replaced
I0114 16:03:05.506375   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017784-32724", Name:"mock", UID:"53aed99c-6fa4-48c1-8156-a369dcc80955", APIVersion:"v1", ResourceVersion:"2764", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-b8w7v
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
E0114 16:03:05.641729   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
E0114 16:03:05.720864   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:03:05.831565   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock edited
replicationcontroller/mock edited
E0114 16:03:05.935229   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
service/mock labeled
replicationcontroller/mock labeled
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
service/mock annotated
replicationcontroller/mock annotated
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
E0114 16:03:06.642820   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
E0114 16:03:06.722339   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Testing with file hack/testdata/multi-resource-list.json and replace with file hack/testdata/multi-resource-list-modify.json
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:03:06.832634   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:03:06.937060   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
replicationcontroller/mock created
I0114 16:03:07.080635   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017784-32724", Name:"mock", UID:"a1a0100c-724b-497e-9ebc-73d483d26834", APIVersion:"v1", ResourceVersion:"2787", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-tcnlv
generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
... skipping 18 lines ...
Name:         mock
Namespace:    namespace-1579017784-32724
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 2 lines ...
    Mounts:       <none>
  Volumes:        <none>
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: mock-tcnlv
E0114 16:03:07.644258   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:03:07.723638   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
service/mock replaced
replicationcontroller/mock replaced
I0114 16:03:07.760461   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017784-32724", Name:"mock", UID:"41c47e18-971e-45b2-b31f-3760ef6f4444", APIVersion:"v1", ResourceVersion:"2802", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-46lf2
E0114 16:03:07.833795   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
E0114 16:03:07.938421   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
service/mock edited
replicationcontroller/mock edited
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
service/mock labeled
replicationcontroller/mock labeled
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
E0114 16:03:08.645908   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock annotated
replicationcontroller/mock annotated
E0114 16:03:08.724571   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
E0114 16:03:08.834981   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
E0114 16:03:08.939762   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
Testing with file hack/testdata/multi-resource-json.json and replace with file hack/testdata/multi-resource-json-modify.json
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
service/mock created
replicationcontroller/mock created
I0114 16:03:09.413103   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017784-32724", Name:"mock", UID:"9e3137e5-68cd-448b-a212-952a216a0e96", APIVersion:"v1", ResourceVersion:"2827", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-vltjp
generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
E0114 16:03:09.647123   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
service/mock   ClusterIP   10.0.0.133   <none>        99/TCP    0s

NAME                         DESIRED   CURRENT   READY   AGE
replicationcontroller/mock   1         1         0       0s
E0114 16:03:09.725850   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:03:09.836182   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Name:              mock
Namespace:         namespace-1579017784-32724
Labels:            app=mock
Annotations:       <none>
Selector:          app=mock
Type:              ClusterIP
... skipping 8 lines ...
Name:         mock
Namespace:    namespace-1579017784-32724
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 2 lines ...
    Mounts:       <none>
  Volumes:        <none>
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: mock-vltjp
E0114 16:03:09.940960   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
service/mock replaced
replicationcontroller/mock replaced
I0114 16:03:10.066129   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017784-32724", Name:"mock", UID:"093175e3-d616-4ab1-9181-92cb12602989", APIVersion:"v1", ResourceVersion:"2842", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-9rnhj
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
service/mock edited
replicationcontroller/mock edited
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
E0114 16:03:10.648255   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock labeled
replicationcontroller/mock labeled
E0114 16:03:10.726906   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
E0114 16:03:10.837349   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
service/mock annotated
replicationcontroller/mock annotated
E0114 16:03:10.941912   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
service "mock" deleted
replicationcontroller "mock" deleted
Testing with file hack/testdata/multi-resource-rclist.json and replace with file hack/testdata/multi-resource-rclist-modify.json
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/mock created
I0114 16:03:11.555551   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017784-32724", Name:"mock", UID:"242bc1c4-182b-47fa-ab7e-b48c62a7f4fc", APIVersion:"v1", ResourceVersion:"2863", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-bfcft
replicationcontroller/mock2 created
I0114 16:03:11.560155   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017784-32724", Name:"mock2", UID:"029c9fad-6d17-44ba-8821-3a1c39b9c0be", APIVersion:"v1", ResourceVersion:"2865", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock2-qsfh8
E0114 16:03:11.649275   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:78: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:mock2:
E0114 16:03:11.727914   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAME    DESIRED   CURRENT   READY   AGE
mock    1         1         0       0s
mock2   1         1         0       0s
E0114 16:03:11.838477   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Name:         mock
Namespace:    namespace-1579017784-32724
Selector:     app=mock
Labels:       app=mock
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 11 lines ...
Namespace:    namespace-1579017784-32724
Selector:     app=mock2
Labels:       app=mock2
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock2
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 2 lines ...
    Mounts:       <none>
  Volumes:        <none>
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: mock2-qsfh8
E0114 16:03:11.943179   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller "mock" deleted
replicationcontroller "mock2" deleted
replicationcontroller/mock replaced
I0114 16:03:12.078164   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017784-32724", Name:"mock", UID:"86b3646d-e49f-4132-941b-8724bbcd3818", APIVersion:"v1", ResourceVersion:"2879", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-6v2qh
replicationcontroller/mock2 replaced
I0114 16:03:12.083404   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017784-32724", Name:"mock2", UID:"24f21bcf-02ca-4c5d-a01a-f5cee9ac7707", APIVersion:"v1", ResourceVersion:"2881", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock2-t8gx5
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
generic-resources.sh:104: Successful get rc mock2 {{.metadata.labels.status}}: replaced
replicationcontroller/mock edited
replicationcontroller/mock2 edited
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
generic-resources.sh:122: Successful get rc mock2 {{.metadata.labels.status}}: edited
E0114 16:03:12.650483   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:03:12.729152   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/mock labeled
replicationcontroller/mock2 labeled
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
E0114 16:03:12.839720   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:142: Successful get rc mock2 {{.metadata.labels.labeled}}: true
E0114 16:03:12.944560   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/mock annotated
replicationcontroller/mock2 annotated
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
generic-resources.sh:161: Successful get rc mock2 {{.metadata.annotations.annotated}}: true
replicationcontroller "mock" deleted
replicationcontroller "mock2" deleted
Testing with file hack/testdata/multi-resource-svclist.json and replace with file hack/testdata/multi-resource-svclist-modify.json
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
service/mock created
service/mock2 created
E0114 16:03:13.651648   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:70: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:mock2:
E0114 16:03:13.730366   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAME    TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
mock    ClusterIP   10.0.0.96    <none>        99/TCP    0s
mock2   ClusterIP   10.0.0.136   <none>        99/TCP    0s
E0114 16:03:13.840975   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 16:03:13.882347   54489 horizontal.go:353] Horizontal Pod Autoscaler frontend has been deleted in namespace-1579017771-8799
Name:              mock
Namespace:         namespace-1579017784-32724
Labels:            app=mock
Annotations:       <none>
Selector:          app=mock
... skipping 15 lines ...
IP:                10.0.0.136
Port:              <unset>  99/TCP
TargetPort:        9949/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
E0114 16:03:13.945664   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
service "mock2" deleted
service/mock replaced
service/mock2 replaced
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
generic-resources.sh:98: Successful get services mock2 {{.metadata.labels.status}}: replaced
service/mock edited
service/mock2 edited
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
E0114 16:03:14.652815   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:116: Successful get services mock2 {{.metadata.labels.status}}: edited
E0114 16:03:14.731489   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock labeled
service/mock2 labeled
E0114 16:03:14.842270   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
generic-resources.sh:136: Successful get services mock2 {{.metadata.labels.labeled}}: true
E0114 16:03:14.946788   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock annotated
service/mock2 annotated
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
generic-resources.sh:155: Successful get services mock2 {{.metadata.annotations.annotated}}: true
service "mock" deleted
service "mock2" deleted
generic-resources.sh:173: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:174: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:03:15.653926   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:03:15.732689   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
replicationcontroller/mock created
I0114 16:03:15.743654   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017784-32724", Name:"mock", UID:"2891ca5e-0563-4b9e-bcb6-41979a668072", APIVersion:"v1", ResourceVersion:"2942", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-ncx7n
generic-resources.sh:180: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
E0114 16:03:15.843469   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:181: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
E0114 16:03:15.947968   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
generic-resources.sh:187: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:188: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
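Every "Testing with file ..." block above runs the same bulk lifecycle against a multi-document manifest: create, read back, describe, replace, edit, label, annotate, delete, always addressing every object in the file through one -f argument. A minimal sketch of that cycle using the same test-data paths (the flow is reconstructed from the assertions, not copied from the harness):

    kubectl create -f hack/testdata/multi-resource-yaml.yaml           # service + replicationcontroller in one file
    kubectl get -f hack/testdata/multi-resource-yaml.yaml              # resolves every object named in the file
    kubectl replace -f hack/testdata/multi-resource-yaml-modify.yaml   # modify file carries labels.status: replaced
    kubectl label -f hack/testdata/multi-resource-yaml-modify.yaml labeled=true
    kubectl annotate -f hack/testdata/multi-resource-yaml-modify.yaml annotated=true
    kubectl delete -f hack/testdata/multi-resource-yaml-modify.yaml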
Recording: run_persistent_volumes_tests
... skipping 4 lines ...
+++ command: run_persistent_volumes_tests
+++ [0114 16:03:16] Creating namespace namespace-1579017796-30338
namespace/namespace-1579017796-30338 created
Context "test" modified.
+++ [0114 16:03:16] Testing persistent volumes
storage.sh:30: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:03:16.655076   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:03:16.734029   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume/pv0001 created
E0114 16:03:16.821544   54489 pv_protection_controller.go:116] PV pv0001 failed with : Operation cannot be fulfilled on persistentvolumes "pv0001": the object has been modified; please apply your changes to the latest version and try again
E0114 16:03:16.844626   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:33: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
E0114 16:03:16.949338   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume "pv0001" deleted
persistentvolume/pv0002 created
storage.sh:36: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0002:
persistentvolume "pv0002" deleted
persistentvolume/pv0003 created
storage.sh:39: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0003:
E0114 16:03:17.656222   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:03:17.735214   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume "pv0003" deleted
storage.sh:42: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:03:17.845853   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:03:17.950569   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume/pv0001 created
storage.sh:45: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
persistentvolume "pv0001" deleted
has:warning: deleting cluster-scoped resources
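The persistent-volume pass only exercises create/get/delete of bare PVs plus the cluster-scoped deletion warning asserted just above; nothing ever binds to them. A PV of roughly the shape these tests create (spec values here are illustrative, not read from hack/testdata):

    kubectl create -f - <<EOF
    apiVersion: v1
    kind: PersistentVolume
    metadata:
      name: pv0001
    spec:
      capacity:
        storage: 1Gi
      accessModes: ["ReadWriteOnce"]
      hostPath:
        path: /tmp/pv0001
    EOF
    kubectl get pv -o go-template='{{range.items}}{{.metadata.name}}:{{end}}'   # pv0001: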
... skipping 10 lines ...
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_persistent_volume_claims_tests
+++ [0114 16:03:18] Creating namespace namespace-1579017798-30984
namespace/namespace-1579017798-30984 created
Context "test" modified.
+++ [0114 16:03:18] Testing persistent volumes claims
E0114 16:03:18.657402   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:64: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:03:18.736463   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolumeclaim/myclaim-1 created
I0114 16:03:18.845813   54489 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1579017798-30984", Name:"myclaim-1", UID:"2ec257c2-8655-4ad9-89d0-36e7ac532010", APIVersion:"v1", ResourceVersion:"2979", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E0114 16:03:18.847217   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 16:03:18.850195   54489 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1579017798-30984", Name:"myclaim-1", UID:"2ec257c2-8655-4ad9-89d0-36e7ac532010", APIVersion:"v1", ResourceVersion:"2981", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E0114 16:03:18.951753   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:67: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-1:
persistentvolumeclaim "myclaim-1" deleted
I0114 16:03:19.034349   54489 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1579017798-30984", Name:"myclaim-1", UID:"2ec257c2-8655-4ad9-89d0-36e7ac532010", APIVersion:"v1", ResourceVersion:"2983", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
persistentvolumeclaim/myclaim-2 created
I0114 16:03:19.203930   54489 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1579017798-30984", Name:"myclaim-2", UID:"2df89019-9d97-4656-948e-cf1156166f63", APIVersion:"v1", ResourceVersion:"2986", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I0114 16:03:19.207019   54489 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1579017798-30984", Name:"myclaim-2", UID:"2df89019-9d97-4656-948e-cf1156166f63", APIVersion:"v1", ResourceVersion:"2988", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
storage.sh:71: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-2:
persistentvolumeclaim "myclaim-2" deleted
I0114 16:03:19.396897   54489 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1579017798-30984", Name:"myclaim-2", UID:"2df89019-9d97-4656-948e-cf1156166f63", APIVersion:"v1", ResourceVersion:"2990", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
persistentvolumeclaim/myclaim-3 created
I0114 16:03:19.577538   54489 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1579017798-30984", Name:"myclaim-3", UID:"807270a4-5f51-42c8-9c5a-c428db4cc052", APIVersion:"v1", ResourceVersion:"2995", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I0114 16:03:19.581866   54489 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1579017798-30984", Name:"myclaim-3", UID:"807270a4-5f51-42c8-9c5a-c428db4cc052", APIVersion:"v1", ResourceVersion:"2997", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E0114 16:03:19.659716   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:75: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-3:
E0114 16:03:19.737707   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolumeclaim "myclaim-3" deleted
I0114 16:03:19.761271   54489 event.go:278] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1579017798-30984", Name:"myclaim-3", UID:"807270a4-5f51-42c8-9c5a-c428db4cc052", APIVersion:"v1", ResourceVersion:"3000", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E0114 16:03:19.849133   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:78: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
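Each myclaim-N above immediately logs FailedBinding events because this cluster has no matching PersistentVolume and no default StorageClass; the assertions only check that the claims exist and can be deleted while Pending. An equivalent claim (values illustrative):

    kubectl create -f - <<EOF
    apiVersion: v1
    kind: PersistentVolumeClaim
    metadata:
      name: myclaim-1
    spec:
      accessModes: ["ReadWriteOnce"]
      resources:
        requests:
          storage: 1Gi
    EOF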
Recording: run_storage_class_tests
Running command: run_storage_class_tests

+++ Running case: test-cmd.run_storage_class_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_storage_class_tests
+++ [0114 16:03:19] Testing storage class
E0114 16:03:19.952960   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:92: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: 
storageclass.storage.k8s.io/storage-class-name created
storage.sh:108: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: storage-class-name:
storage.sh:109: Successful get sc {{range.items}}{{.metadata.name}}:{{end}}: storage-class-name:
storageclass.storage.k8s.io "storage-class-name" deleted
storage.sh:112: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 2 lines ...
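storage.sh:108/109 confirm that the short alias sc resolves to storageclasses.storage.k8s.io. A StorageClass comparable to the one created above (the provisioner is illustrative; the test object's fields are not shown in the log):

    kubectl create -f - <<EOF
    apiVersion: storage.k8s.io/v1
    kind: StorageClass
    metadata:
      name: storage-class-name
    provisioner: kubernetes.io/no-provisioner
    EOF
    kubectl get sc storage-class-name
    kubectl delete storageclass storage-class-name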
Running command: run_nodes_tests

+++ Running case: test-cmd.run_nodes_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_nodes_tests
+++ [0114 16:03:20] Testing kubectl(v1:nodes)
E0114 16:03:20.660936   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1375: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
E0114 16:03:20.738817   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Labels:
matched CreationTimestamp:
matched Conditions:
matched Addresses:
matched Capacity:
... skipping 41 lines ...
  Resource           Requests  Limits
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
E0114 16:03:20.850319   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1379: Successful describe
Name:               127.0.0.1
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Tue, 14 Jan 2020 15:59:04 +0000
... skipping 35 lines ...
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
E0114 16:03:20.954097   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1381: Successful describe
Name:               127.0.0.1
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Tue, 14 Jan 2020 15:59:04 +0000
... skipping 272 lines ...
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
core.sh:1395: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0114 16:03:21.662066   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 patched
E0114 16:03:21.739620   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1398: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: true
E0114 16:03:21.851614   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 patched
E0114 16:03:21.955401   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1401: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
tokenreview.authentication.k8s.io/<unknown> created
tokenreview.authentication.k8s.io/<unknown> created
+++ exit code: 0
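core.sh:1395-1401 toggle .spec.unschedulable on the only node and read it back with a go-template; <no value> is how the template renders an unset field. The equivalent commands:

    kubectl patch node 127.0.0.1 -p '{"spec":{"unschedulable":true}}'
    kubectl get node 127.0.0.1 -o go-template='{{.spec.unschedulable}}'   # true
    kubectl patch node 127.0.0.1 -p '{"spec":{"unschedulable":null}}'     # back to <no value>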
Recording: run_authorization_tests
Running command: run_authorization_tests
... skipping 80 lines ...
Successful
message:yes
has:yes
Successful
message:yes
has:yes
E0114 16:03:22.663216   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:03:22.740779   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:03:22.852737   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Warning: the server doesn't have a resource type 'invalid_resource'
yes
has:the server doesn't have a resource type
Successful
message:yes
has:yes
E0114 16:03:22.956500   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: --subresource can not be used with NonResourceURL
has:subresource can not be used with NonResourceURL
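The yes/no blocks in this section are kubectl auth can-i checks, including the NonResourceURL restriction asserted just above. Representative invocations (verbs and resources here are illustrative):

    kubectl auth can-i create pods                    # prints yes or no; exit status mirrors the answer
    kubectl auth can-i get /logs                      # non-resource URL form
    kubectl auth can-i get /logs --subresource=log    # fails: --subresource can not be used with NonResourceURL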
Successful
Successful
message:yes
0
has:0
... skipping 11 lines ...
message:Warning: the server doesn't have a resource type 'foo'
yes
has not:Warning: resource 'foo' is not namespace scoped
Successful
message:yes
has not:Warning
E0114 16:03:23.664220   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Warning: resource 'nodes' is not namespace scoped
yes
has:Warning: resource 'nodes' is not namespace scoped
E0114 16:03:23.750950   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:yes
has not:Warning: resource 'nodes' is not namespace scoped
E0114 16:03:23.854115   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
clusterrole.rbac.authorization.k8s.io/testing-CR reconciled
	reconciliation required create
	missing rules added:
		{Verbs:[create delete deletecollection get list patch update watch] APIGroups:[] Resources:[pods] ResourceNames:[] NonResourceURLs:[]}
clusterrolebinding.rbac.authorization.k8s.io/testing-CRB reconciled
	reconciliation required create
... skipping 4 lines ...
	missing subjects added:
		{Kind:Group APIGroup:rbac.authorization.k8s.io Name:system:masters Namespace:}
role.rbac.authorization.k8s.io/testing-R reconciled
	reconciliation required create
	missing rules added:
		{Verbs:[get list watch] APIGroups:[] Resources:[configmaps] ResourceNames:[] NonResourceURLs:[]}
E0114 16:03:23.957782   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
legacy-script.sh:821: Successful get rolebindings -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-RB:
legacy-script.sh:822: Successful get roles -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-R:
legacy-script.sh:823: Successful get clusterrolebindings -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CRB:
legacy-script.sh:824: Successful get clusterroles -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CR:
Successful
message:error: only rbac.authorization.k8s.io/v1 is supported: not *v1beta1.ClusterRole
has:only rbac.authorization.k8s.io/v1 is supported
rolebinding.rbac.authorization.k8s.io "testing-RB" deleted
role.rbac.authorization.k8s.io "testing-R" deleted
warning: deleting cluster-scoped resources, not scoped to the provided namespace
clusterrole.rbac.authorization.k8s.io "testing-CR" deleted
clusterrolebinding.rbac.authorization.k8s.io "testing-CRB" deleted
... skipping 2 lines ...
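The "reconciliation required create / missing rules added" blocks above are kubectl auth reconcile output: unlike apply, it computes the RBAC delta and only adds missing rules and subjects to existing objects. Sketch (manifest name illustrative):

    kubectl auth reconcile -f testing-rbac.yaml
    kubectl auth reconcile -f testing-rbac.yaml --remove-extra-permissions --remove-extra-subjects   # also prune extras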

+++ Running case: test-cmd.run_retrieve_multiple_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_retrieve_multiple_tests
Context "test" modified.
+++ [0114 16:03:24] Testing kubectl(v1:multiget)
E0114 16:03:24.665390   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:242: Successful get nodes/127.0.0.1 service/kubernetes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:kubernetes:
+++ exit code: 0
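get.sh:242 above fetches two unrelated kinds in a single call by listing type/name pairs; the result is a List whose items the go-template walks. Equivalent invocation:

    kubectl get nodes/127.0.0.1 service/kubernetes \
      -o go-template='{{range.items}}{{.metadata.name}}:{{end}}'   # 127.0.0.1:kubernetes: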
Recording: run_resource_aliasing_tests
Running command: run_resource_aliasing_tests
E0114 16:03:24.752143   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_resource_aliasing_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_resource_aliasing_tests
+++ [0114 16:03:24] Creating namespace namespace-1579017804-32494
namespace/namespace-1579017804-32494 created
E0114 16:03:24.855101   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0114 16:03:24] Testing resource aliasing
E0114 16:03:24.958909   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/cassandra created
I0114 16:03:25.082623   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017804-32494", Name:"cassandra", UID:"007d7cc8-d151-4f9b-a1ee-fcb0ca06ef8d", APIVersion:"v1", ResourceVersion:"3026", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-cm44j
I0114 16:03:25.086072   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017804-32494", Name:"cassandra", UID:"007d7cc8-d151-4f9b-a1ee-fcb0ca06ef8d", APIVersion:"v1", ResourceVersion:"3026", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-2zjj4
service/cassandra created
Waiting for Get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}} : expected: cassandra:cassandra:cassandra:cassandra::, got: cassandra:cassandra:cassandra:cassandra:

discovery.sh:91: FAIL!
Get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}}
  Expected: cassandra:cassandra:cassandra:cassandra::
  Got:      cassandra:cassandra:cassandra:cassandra:
55 /home/prow/go/src/k8s.io/kubernetes/hack/lib/test.sh
discovery.sh:92: Successful get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}}: cassandra:cassandra:cassandra:cassandra:
pod "cassandra-2zjj4" deleted
I0114 16:03:25.576086   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017804-32494", Name:"cassandra", UID:"007d7cc8-d151-4f9b-a1ee-fcb0ca06ef8d", APIVersion:"v1", ResourceVersion:"3032", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-ssq9k
pod "cassandra-cm44j" deleted
I0114 16:03:25.584456   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017804-32494", Name:"cassandra", UID:"007d7cc8-d151-4f9b-a1ee-fcb0ca06ef8d", APIVersion:"v1", ResourceVersion:"3032", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-d55vd
replicationcontroller "cassandra" deleted
E0114 16:03:25.590007   54489 replica_set.go:534] sync "namespace-1579017804-32494/cassandra" failed with replicationcontrollers "cassandra" not found
service "cassandra" deleted
+++ exit code: 0
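The aliasing pass reads the cassandra replicationcontroller, its pods, and its service back through the all alias, which expands to a built-in set of common core kinds. The Waiting for/FAIL! lines above are the harness retry loop around one assertion (discovery.sh:91); the immediate re-check at discovery.sh:92 passes. Sketch:

    kubectl get all -l app=cassandra
    kubectl get all -l app=cassandra -o go-template='{{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}}'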
E0114 16:03:25.666506   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_kubectl_explain_tests
Running command: run_kubectl_explain_tests

+++ Running case: test-cmd.run_kubectl_explain_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_explain_tests
+++ [0114 16:03:25] Testing kubectl(v1:explain)
E0114 16:03:25.753375   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
KIND:     Pod
VERSION:  v1

DESCRIPTION:
     Pod is a collection of containers that can run on a host. This resource is
     created by clients and scheduled onto hosts.
... skipping 21 lines ...

   status	<Object>
     Most recently observed status of the pod. This data may not be up to date.
     Populated by the system. Read-only. More info:
     https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#spec-and-status

E0114 16:03:25.856134   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:03:25.960286   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
KIND:     Pod
VERSION:  v1

DESCRIPTION:
     Pod is a collection of containers that can run on a host. This resource is
     created by clients and scheduled onto hosts.
... skipping 77 lines ...
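The KIND/VERSION/DESCRIPTION dumps above are kubectl explain output, which renders the OpenAPI schema for a type or a field path. Representative calls:

    kubectl explain pod
    kubectl explain pod.status        # drill into a single field
    kubectl explain pod --recursive   # full field tree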
Running command: run_kubectl_sort_by_tests

+++ Running case: test-cmd.run_kubectl_sort_by_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_sort_by_tests
+++ [0114 16:03:26] Testing kubectl --sort-by
E0114 16:03:26.667670   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:256: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:03:26.754532   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
No resources found in namespace-1579017804-32494 namespace.
No resources found in namespace-1579017804-32494 namespace.
E0114 16:03:26.857214   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:264: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:03:26.961440   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/valid-pod created
get.sh:268: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          0s
has:valid-pod
... skipping 27 lines ...
NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          0s
has:includeObject=Object
get.sh:279: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
E0114 16:03:27.668773   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:283: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:03:27.755858   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:288: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:03:27.858506   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/sorted-pod1 created
E0114 16:03:27.962697   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:292: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:
pod/sorted-pod2 created
get.sh:296: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:sorted-pod2:
(Bpod/sorted-pod3 created
get.sh:300: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:sorted-pod2:sorted-pod3:
Successful
message:sorted-pod1:sorted-pod2:sorted-pod3:
has:sorted-pod1:sorted-pod2:sorted-pod3:
E0114 16:03:28.669939   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:sorted-pod3:sorted-pod2:sorted-pod1:
has:sorted-pod3:sorted-pod2:sorted-pod1:
E0114 16:03:28.757130   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:sorted-pod2:sorted-pod1:sorted-pod3:
has:sorted-pod2:sorted-pod1:sorted-pod3:
E0114 16:03:28.859753   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:sorted-pod1:sorted-pod2:sorted-pod3:
has:sorted-pod1:sorted-pod2:sorted-pod3:
E0114 16:03:28.964035   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:I0114:I0114:I0114:I0114:I0114:I0114:I0114:I0114:I0114:I0114:NAME:sorted-pod2:sorted-pod1:sorted-pod3:
has:sorted-pod2:sorted-pod1:sorted-pod3:
Successful
message:I0114 16:03:29.003532   86655 loader.go:375] Config loaded from file:  /tmp/tmp.75KxkBT0hy/.kube/config
I0114 16:03:29.013295   86655 round_trippers.go:420] GET http://localhost:8080/api/v1/namespaces/namespace-1579017804-32494/pods
... skipping 24 lines ...
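The orderings asserted above come from kubectl get --sort-by, which sorts client-side on a JSONPath expression; the expected message containing I0114:...:NAME shows that with verbosity raised, klog lines are captured ahead of the table. Representative commands:

    kubectl get pods --sort-by='{.metadata.name}'
    kubectl get pods --sort-by='{.metadata.creationTimestamp}'
    kubectl get pods --sort-by='{.metadata.name}' -v=6   # verbose client logs precede the table, as seen above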
+++ Running case: test-cmd.run_kubectl_all_namespace_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_all_namespace_tests
+++ [0114 16:03:29] Testing kubectl --all-namespace
get.sh:342: Successful get namespaces {{range.items}}{{if eq .metadata.name \"default\"}}{{.metadata.name}}:{{end}}{{end}}: default:
get.sh:346: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0114 16:03:29.671235   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:03:29.758322   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/valid-pod created
E0114 16:03:29.860893   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:350: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
NAMESPACE                    NAME        READY   STATUS    RESTARTS   AGE
namespace-1579017804-32494   valid-pod   0/1     Pending   0          0s
E0114 16:03:29.965103   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/all-ns-test-1 created
serviceaccount/test created
namespace/all-ns-test-2 created
serviceaccount/test created
Successful
message:NAMESPACE                    NAME      SECRETS   AGE
... skipping 117 lines ...
namespace-1579017796-30338   default   0         14s
namespace-1579017798-30984   default   0         12s
namespace-1579017804-32494   default   0         6s
some-other-random            default   0         7s
has:all-ns-test-2
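This pass creates serviceaccount/test in both all-ns-test-1 and all-ns-test-2 and asserts that both rows appear once the NAMESPACE column is requested. Equivalent calls:

    kubectl get serviceaccounts --all-namespaces
    kubectl get sa -A   # -A is the short form of --all-namespaces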
namespace "all-ns-test-1" deleted
E0114 16:03:30.672428   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:03:30.759570   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 18 lines ...
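The repeated reflector errors above (and interleaved throughout the rest of this run) come from a metadata-only informer whose LIST of a no-longer-served resource keeps failing and being retried about once a second. A minimal sketch of how such an informer is wired up with client-go; the GVR and kubeconfig path below are hypothetical stand-ins, while the real test harness discovers its GVRs dynamically:

package main

import (
	"time"

	"k8s.io/apimachinery/pkg/runtime/schema"
	"k8s.io/client-go/metadata"
	"k8s.io/client-go/metadata/metadatainformer"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Hypothetical kubeconfig path; any *rest.Config works here.
	config, err := clientcmd.BuildConfigFromFlags("", "/tmp/kubeconfig")
	if err != nil {
		panic(err)
	}
	metaClient, err := metadata.NewForConfig(config)
	if err != nil {
		panic(err)
	}
	// Informer factory that lists/watches only object metadata
	// (*v1.PartialObjectMetadata), the type named in the errors above.
	factory := metadatainformer.NewSharedInformerFactory(metaClient, 10*time.Minute)
	// Hypothetical GVR. If the server stops serving it, every LIST fails
	// with "the server could not find the requested resource" and the
	// reflector retries, producing log lines like the ones above.
	gvr := schema.GroupVersionResource{Group: "example.com", Version: "v1", Resource: "widgets"}
	informer := factory.ForResource(gvr).Informer()

	stop := make(chan struct{})
	defer close(stop)
	go informer.Run(stop)
	select {} // block; a real consumer would wait on informer.HasSynced
}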
namespace "all-ns-test-2" deleted
E0114 16:03:35.678210   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 19 lines ...
I0114 16:03:40.593689   54489 namespace_controller.go:185] Namespace has been deleted all-ns-test-1
E0114 16:03:40.684928   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:03:40.771856   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:376: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
E0114 16:03:40.874549   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
E0114 16:03:40.978525   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:380: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
get.sh:384: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
Successful
message:NAME        STATUS     ROLES    AGE     VERSION
127.0.0.1   NotReady   <none>   4m37s   
has not:NAMESPACE
... skipping 7 lines ...
+++ [0114 16:03:41] Testing --template support on commands
+++ [0114 16:03:41] Creating namespace namespace-1579017821-30617
namespace/namespace-1579017821-30617 created
Context "test" modified.
template-output.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/valid-pod created
E0114 16:03:41.685925   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
{
    "apiVersion": "v1",
    "items": [
        {
            "apiVersion": "v1",
            "kind": "Pod",
... skipping 94 lines ...
    "kind": "List",
    "metadata": {
        "resourceVersion": "",
        "selfLink": ""
    }
}
E0114 16:03:41.772999   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
template-output.sh:35: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
E0114 16:03:41.875774   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
E0114 16:03:41.979722   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
Successful
message:valid-pod:
has:valid-pod:
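The --template assertions above exercise kubectl's -o go-template output, which is evaluated with Go's standard text/template package against the decoded API response. A self-contained sketch of the same template run over a stand-in pod list:

package main

import (
	"os"
	"text/template"
)

func main() {
	// Stand-in for a decoded "get pods" List response.
	list := map[string]interface{}{
		"items": []interface{}{
			map[string]interface{}{
				"metadata": map[string]interface{}{"name": "valid-pod"},
			},
		},
	}
	// Same template string the tests pass to kubectl.
	tmpl := template.Must(template.New("names").Parse(
		"{{range .items}}{{.metadata.name}}:{{end}}"))
	if err := tmpl.Execute(os.Stdout, list); err != nil {
		panic(err)
	}
	// Prints: valid-pod:
}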
... skipping 11 lines ...
has:redis-slave:
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
Successful
message:nginx:
has:nginx:
E0114 16:03:42.687025   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
kubectl run --generator=job/v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
Successful
message:pi:
has:pi:
E0114 16:03:42.774195   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:127.0.0.1:
has:127.0.0.1:
E0114 16:03:42.876965   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 untainted
E0114 16:03:42.980930   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/cassandra created
I0114 16:03:43.073354   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017821-30617", Name:"cassandra", UID:"71b68c2e-c366-4a00-b5fd-59d383184957", APIVersion:"v1", ResourceVersion:"3108", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-hw5c6
I0114 16:03:43.077098   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017821-30617", Name:"cassandra", UID:"71b68c2e-c366-4a00-b5fd-59d383184957", APIVersion:"v1", ResourceVersion:"3108", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-kfswh
Successful
message:cassandra:
has:cassandra:
... skipping 23 lines ...
has:cm:
I0114 16:03:43.609398   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017821-30617", Name:"deploy", UID:"5bff93dc-05fe-48ab-8105-7be99092d8b4", APIVersion:"apps/v1", ResourceVersion:"3119", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set deploy-74bcc58696 to 1
I0114 16:03:43.613221   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017821-30617", Name:"deploy-74bcc58696", UID:"4f95931f-c577-4c58-8a27-8be620054dd2", APIVersion:"apps/v1", ResourceVersion:"3120", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deploy-74bcc58696-wjw5t
Successful
message:deploy:
has:deploy:
E0114 16:03:43.688264   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:03:43.775269   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
cronjob.batch/pi created
Successful
message:foo:
has:foo:
E0114 16:03:43.878313   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:bar:
has:bar:
E0114 16:03:43.982096   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
Successful
message:myrole:
has:myrole:
... skipping 15 lines ...
Successful
message:valid-pod:
has:valid-pod:
Successful
message:valid-pod:
has:valid-pod:
E0114 16:03:44.689465   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:kubernetes:
has:kubernetes:
E0114 16:03:44.776473   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
E0114 16:03:44.879374   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
Successful
message:foo:
has:foo:
E0114 16:03:44.983261   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
Successful
message:foo:
has:foo:
... skipping 30 lines ...
Successful
message:deploy:
has:deploy:
Successful
message:deploy:
has:deploy:
E0114 16:03:45.690756   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:deploy:
has:deploy:
I0114 16:03:45.763785   54489 namespace_controller.go:185] Namespace has been deleted all-ns-test-2
E0114 16:03:45.777658   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:deploy:
has:deploy:
E0114 16:03:45.880551   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Config:
has:Config
Successful
message:apiVersion: v1
kind: ConfigMap
metadata:
  creationTimestamp: null
  name: cm
has:kind: ConfigMap
E0114 16:03:45.992417   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
cronjob.batch "pi" deleted
pod "cassandra-hw5c6" deleted
I0114 16:03:46.155801   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017821-30617", Name:"cassandra", UID:"71b68c2e-c366-4a00-b5fd-59d383184957", APIVersion:"v1", ResourceVersion:"3114", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-j4t75
pod "cassandra-kfswh" deleted
I0114 16:03:46.165436   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1579017821-30617", Name:"cassandra", UID:"71b68c2e-c366-4a00-b5fd-59d383184957", APIVersion:"v1", ResourceVersion:"3144", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-ts7rg
pod "deploy-74bcc58696-wjw5t" deleted
... skipping 8 lines ...
Running command: run_certificates_tests

+++ Running case: test-cmd.run_certificates_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_certificates_tests
+++ [0114 16:03:46] Testing certificates
E0114 16:03:46.691921   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:03:46.778965   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
E0114 16:03:46.881910   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:29: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
E0114 16:03:46.993667   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo approved
{
    "apiVersion": "v1",
    "items": [
        {
            "apiVersion": "certificates.k8s.io/v1beta1",
... skipping 37 lines ...
}
certificate.sh:32: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Approved
certificatesigningrequest.certificates.k8s.io "foo" deleted
certificate.sh:34: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo created
certificate.sh:37: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
E0114 16:03:47.693197   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo approved
E0114 16:03:47.780222   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
{
    "apiVersion": "v1",
    "items": [
        {
            "apiVersion": "certificates.k8s.io/v1beta1",
            "kind": "CertificateSigningRequest",
... skipping 49 lines ...
    "kind": "List",
    "metadata": {
        "resourceVersion": "",
        "selfLink": ""
    }
}
E0114 16:03:47.883072   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:40: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Approved
E0114 16:03:47.994856   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io "foo" deleted
certificate.sh:42: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo created
certificate.sh:46: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo denied
{
... skipping 37 lines ...
    "metadata": {
        "resourceVersion": "",
        "selfLink": ""
    }
}
certificate.sh:49: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Denied
E0114 16:03:48.694412   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io "foo" deleted
E0114 16:03:48.781522   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:51: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
E0114 16:03:48.884191   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:03:48.996011   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
certificate.sh:54: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo denied
{
    "apiVersion": "v1",
    "items": [
... skipping 63 lines ...
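The Approved/Denied values asserted by certificate.sh are condition types on the CSR's status; kubectl certificate approve/deny appends one and writes it back via the approval subresource. A sketch of the condition shape only, using the types from k8s.io/api/certificates/v1beta1 (the version shown in the JSON dumps above); the Reason/Message values here are illustrative:

package main

import (
	"fmt"

	certv1beta1 "k8s.io/api/certificates/v1beta1"
)

func main() {
	var csr certv1beta1.CertificateSigningRequest
	csr.Status.Conditions = append(csr.Status.Conditions,
		certv1beta1.CertificateSigningRequestCondition{
			Type:    certv1beta1.CertificateApproved, // or certv1beta1.CertificateDenied
			Reason:  "KubectlApprove",
			Message: "approved for testing",
		})
	// The tests render exactly this field with
	// {{range .status.conditions}}{{.type}}{{end}} -> "Approved".
	for _, c := range csr.Status.Conditions {
		fmt.Println(c.Type)
	}
}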
Running command: run_cluster_management_tests

+++ Running case: test-cmd.run_cluster_management_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_cluster_management_tests
+++ [0114 16:03:49] Testing cluster-management commands
E0114 16:03:49.695637   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:27: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
E0114 16:03:49.782591   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0114 16:03:49.885411   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/test-pod-1 created
E0114 16:03:49.997148   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/test-pod-2 created
node-management.sh:76: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: 
node/127.0.0.1 tainted
node-management.sh:79: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: dedicated=foo:PreferNoSchedule
node/127.0.0.1 untainted
node-management.sh:83: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: 
node-management.sh:87: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 cordoned (dry run)
E0114 16:03:50.696712   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:89: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0114 16:03:50.783868   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:93: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0114 16:03:50.886611   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 cordoned (dry run)
node/127.0.0.1 drained (dry run)
E0114 16:03:50.998329   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:96: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
node-management.sh:97: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node-management.sh:101: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node-management.sh:103: Successful get pods {{range .items}}{{.metadata.name}},{{end}}: test-pod-1,test-pod-2,
node/127.0.0.1 cordoned
node/127.0.0.1 drained
node-management.sh:106: Successful get pods/test-pod-2 {{.metadata.name}}: test-pod-2
(Bpod "test-pod-2" deleted
node/127.0.0.1 uncordoned
node-management.sh:111: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0114 16:03:51.697869   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:115: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0114 16:03:51.784997   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:node/127.0.0.1 already uncordoned (dry run)
has:already uncordoned
E0114 16:03:51.887785   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:119: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E0114 16:03:51.999342   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 labeled
node-management.sh:124: Successful get nodes 127.0.0.1 {{.metadata.labels.test}}: label
Successful
message:error: cannot specify both a node name and a --selector option
See 'kubectl drain -h' for help and examples
has:cannot specify both a node name
Successful
message:error: USAGE: cordon NODE [flags]
See 'kubectl cordon -h' for help and examples
has:error\: USAGE\: cordon NODE
node/127.0.0.1 already uncordoned
Successful
message:error: You must provide one or more resources by argument or filename.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
   '<resource> <name>'
   '<resource>'
has:must provide one or more resources
... skipping 9 lines ...
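The cordon/uncordon assertions above read {{.spec.unschedulable}}; cordoning is, at bottom, setting that single field on the Node (drain additionally evicts the pods, as seen with test-pod-1/test-pod-2). A sketch of the cordon step with a current client-go (context-taking signatures); the kubeconfig path is a hypothetical stand-in:

package main

import (
	"context"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Hypothetical kubeconfig path.
	config, err := clientcmd.BuildConfigFromFlags("", "/tmp/kubeconfig")
	if err != nil {
		panic(err)
	}
	clientset, err := kubernetes.NewForConfig(config)
	if err != nil {
		panic(err)
	}
	ctx := context.Background()
	node, err := clientset.CoreV1().Nodes().Get(ctx, "127.0.0.1", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	// Cordon: mark the node unschedulable, which is exactly what
	// node-management.sh observes via {{.spec.unschedulable}}.
	node.Spec.Unschedulable = true
	if _, err := clientset.CoreV1().Nodes().Update(ctx, node, metav1.UpdateOptions{}); err != nil {
		panic(err)
	}
}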
Running command: run_plugins_tests

+++ Running case: test-cmd.run_plugins_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_plugins_tests
+++ [0114 16:03:52] Testing kubectl plugins
E0114 16:03:52.699126   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/version/kubectl-version
  - warning: kubectl-version overwrites existing command: "kubectl version"

error: one plugin warning was found
has:kubectl-version overwrites existing command: "kubectl version"
E0114 16:03:52.786564   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/kubectl-foo
test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo
  - warning: test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin: test/fixtures/pkg/kubectl/plugins/kubectl-foo

error: one plugin warning was found
has:test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin
E0114 16:03:52.889128   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/kubectl-foo
has:plugins are available
Successful
message:Unable read directory "test/fixtures/pkg/kubectl/plugins/empty" from your PATH: open test/fixtures/pkg/kubectl/plugins/empty: no such file or directory. Skipping...
error: unable to find any kubectl plugins in your PATH
has:unable to find any kubectl plugins in your PATH
E0114 16:03:53.000759   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:I am plugin foo
has:plugin foo
Successful
message:I am plugin bar called with args test/fixtures/pkg/kubectl/plugins/bar/kubectl-bar arg1
has:test/fixtures/pkg/kubectl/plugins/bar/kubectl-bar arg1
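kubectl discovers plugins purely by name: any executable on PATH called kubectl-<name> is listed by `kubectl plugin list` and run as `kubectl <name>`, which is why the similarly named fixtures above shadow and overwrite one another. A minimal Go plugin that mirrors the fixtures' output (build it as kubectl-foo):

package main

import (
	"fmt"
	"os"
	"strings"
)

func main() {
	// Echo the invocation back, like the shell fixtures do.
	fmt.Printf("I am plugin foo called with args %s\n", strings.Join(os.Args, " "))
}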
... skipping 9 lines ...

+++ Running case: test-cmd.run_impersonation_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_impersonation_tests
+++ [0114 16:03:53] Testing impersonation
Successful
message:error: requesting groups or user-extra for  without impersonating a user
has:without impersonating a user
certificatesigningrequest.certificates.k8s.io/foo created
E0114 16:03:53.700256   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
authorization.sh:68: Successful get csr/foo {{.spec.username}}: user1
E0114 16:03:53.787686   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
authorization.sh:69: Successful get csr/foo {{range .spec.groups}}{{.}}{{end}}: system:authenticated
E0114 16:03:53.890650   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io "foo" deleted
E0114 16:03:54.002156   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
authorization.sh:74: Successful get csr/foo {{len .spec.groups}}: 3
authorization.sh:75: Successful get csr/foo {{range .spec.groups}}{{.}} {{end}}: group2 group1 ,,,chameleon 
certificatesigningrequest.certificates.k8s.io "foo" deleted
+++ exit code: 0
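kubectl's --as / --as-group flags (exercised above; the created CSR records .spec.username user1 and the impersonated group list) map to client-go's rest.ImpersonationConfig, which the client sends as Impersonate-User / Impersonate-Group request headers. A sketch, with a hypothetical kubeconfig path:

package main

import (
	"k8s.io/client-go/rest"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Hypothetical kubeconfig path.
	config, err := clientcmd.BuildConfigFromFlags("", "/tmp/kubeconfig")
	if err != nil {
		panic(err)
	}
	// Equivalent of: kubectl --as=user1 --as-group=group1 --as-group=group2
	config.Impersonate = rest.ImpersonationConfig{
		UserName: "user1",
		Groups:   []string{"group1", "group2"},
	}
	_ = config // build clients from this config as usual
}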
Recording: run_wait_tests
... skipping 5 lines ...
+++ [0114 16:03:54] Testing kubectl wait
+++ [0114 16:03:54] Creating namespace namespace-1579017834-24065
namespace/namespace-1579017834-24065 created
Context "test" modified.
deployment.apps/test-1 created
I0114 16:03:54.700812   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017834-24065", Name:"test-1", UID:"1ddfec7f-15a8-4098-ab66-9431d8d29ab2", APIVersion:"apps/v1", ResourceVersion:"3210", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-1-6d98955cc9 to 1
E0114 16:03:54.701191   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 16:03:54.707584   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017834-24065", Name:"test-1-6d98955cc9", UID:"836dbcb7-cc47-45df-b904-d9266c8ab65e", APIVersion:"apps/v1", ResourceVersion:"3211", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-1-6d98955cc9-6qf6s
deployment.apps/test-2 created
I0114 16:03:54.787052   54489 event.go:278] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1579017834-24065", Name:"test-2", UID:"7110c706-0c15-4c9e-8fdb-be13d053e0f0", APIVersion:"apps/v1", ResourceVersion:"3220", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-2-65897ff84d to 1
E0114 16:03:54.790194   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0114 16:03:54.793521   54489 event.go:278] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1579017834-24065", Name:"test-2-65897ff84d", UID:"dd93a363-d735-4562-980e-23b01270296d", APIVersion:"apps/v1", ResourceVersion:"3221", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-2-65897ff84d-pn7bw
wait.sh:36: Successful get deployments {{range .items}}{{.metadata.name}},{{end}}: test-1,test-2,
E0114 16:03:54.891978   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 8 lines ...
deployment.apps "test-1" deleted
deployment.apps "test-2" deleted
Successful
message:deployment.apps/test-1 condition met
deployment.apps/test-2 condition met
has:test-1 condition met
Successful
message:deployment.apps/test-1 condition met
deployment.apps/test-2 condition met
has:test-2 condition met
+++ exit code: 0
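kubectl wait --for=condition=available (the wait.sh assertions above) polls until the named condition reports True on each object. The same loop in Go against the test-1 deployment, assuming a current client-go; the kubeconfig path is a hypothetical stand-in:

package main

import (
	"context"
	"time"

	appsv1 "k8s.io/api/apps/v1"
	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/util/wait"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	config, err := clientcmd.BuildConfigFromFlags("", "/tmp/kubeconfig") // hypothetical path
	if err != nil {
		panic(err)
	}
	clientset, err := kubernetes.NewForConfig(config)
	if err != nil {
		panic(err)
	}
	ctx := context.Background()
	// Poll until deployment test-1 reports Available=True, the moment the
	// log above prints "deployment.apps/test-1 condition met".
	err = wait.PollImmediate(time.Second, 30*time.Second, func() (bool, error) {
		d, err := clientset.AppsV1().Deployments("namespace-1579017834-24065").Get(ctx, "test-1", metav1.GetOptions{})
		if err != nil {
			return false, err
		}
		for _, c := range d.Status.Conditions {
			if c.Type == appsv1.DeploymentAvailable && c.Status == corev1.ConditionTrue {
				return true, nil
			}
		}
		return false, nil
	})
	if err != nil {
		panic(err)
	}
}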
E0114 16:03:57.006373   54489 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
No resources found
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
No resources found
+++ [0114 16:03:57] TESTS PASSED
I0114 16:03:57.241285   51025 dynamic_serving_content.go:144] Shutting down serving-cert::/tmp/apiserver.crt::/tmp/apiserver.key
... skipping 37 lines ...
I0114 16:03:57.244946   51025 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W0114 16:03:57.245198   51025 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 99 lines ...
junit report dir: /logs/artifacts
+++ [0114 16:03:57] Clean up complete
+ make test-integration
W0114 16:03:58.245840   51025 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 130 lines ...
W0114 16:04:00.159085   51025 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
+++ [0114 16:04:01] Checking etcd is on PATH
/home/prow/go/src/k8s.io/kubernetes/third_party/etcd/etcd
+++ [0114 16:04:01] Starting etcd instance
W0114 16:04:01.744559   51025 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0114 16:04:01.761427   51025 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
etcd --advertise-client-urls http://127.0.0.1:2379 --data-dir /tmp/tmp.GDzeK6Om0S --listen-client-urls http://127.0.0.1:2379 --debug > "/logs/artifacts/etcd.51735ae7-36e5-11ea-9f20-3687633bf296.root.log.DEBUG.20200114-160401.90689" 2>/dev/null
Waiting for etcd to come up.
E0114 16:04:02.506533   51025 controller.go:183] StorageError: key not found, Code: 1, Key: /registry/masterleases/10.60.175.54, ResourceVersion: 0, AdditionalErrorMsg: 
+++ [0114 16:04:02] On try 2, etcd: : {"health":"true"}
{"header":{"cluster_id":"14841639068965178418","member_id":"10276657743932975437","revision":"2","raft_term":"2"}}+++ [0114 16:04:02] Running integration test cases
+++ [0114 16:04:06] Running tests without code coverage
... skipping 309 lines ...
    synthetic_master_test.go:721: UPDATE_NODE_APISERVER is not set

=== SKIP: test/integration/scheduler_perf TestSchedule100Node3KPods (0.00s)
    scheduler_test.go:73: Skipping because we want to run short tests
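    That skip is the standard Go testing idiom for gating expensive cases behind go test -short; a sketch of the pattern is shown below (hypothetical, not the actual scheduler_perf source).

package scheduler_test

import "testing"

func TestSchedule100Node3KPods(t *testing.T) {
	// Heavy benchmarks opt out when the suite is run with -short.
	if testing.Short() {
		t.Skip("Skipping because we want to run short tests")
	}
	// ... schedule 3000 pods across 100 nodes (elided) ...
}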


=== Failed
=== FAIL: test/integration/client TestDynamicClient (6.89s)
I0114 16:06:21.182919  106102 controller.go:180] Shutting down kubernetes service endpoint reconciler
I0114 16:06:21.183409  106102 dynamic_cafile_content.go:181] Shutting down request-header::/tmp/kubernetes-kube-apiserver633882346/proxy-ca.crt
I0114 16:06:21.183528  106102 controller.go:123] Shutting down OpenAPI controller
I0114 16:06:21.183683  106102 nonstructuralschema_controller.go:197] Shutting down NonStructuralSchemaConditionController
I0114 16:06:21.183775  106102 naming_controller.go:300] Shutting down NamingConditionController
I0114 16:06:21.183859  106102 apiapproval_controller.go:196] Shutting down KubernetesAPIApprovalPolicyConformantConditionController
... skipping 9 lines ...
I0114 16:06:21.184620  106102 dynamic_cafile_content.go:181] Shutting down request-header::/tmp/kubernetes-kube-apiserver633882346/proxy-ca.crt
I0114 16:06:21.184642  106102 dynamic_cafile_content.go:181] Shutting down client-ca-bundle::/tmp/kubernetes-kube-apiserver633882346/client-ca.crt
I0114 16:06:21.184741  106102 secure_serving.go:222] Stopped listening on 127.0.0.1:35591
I0114 16:06:21.184752  106102 tlsconfig.go:256] Shutting down DynamicServingCertificateController
I0114 16:06:21.184793  106102 dynamic_serving_content.go:144] Shutting down serving-cert::/tmp/kubernetes-kube-apiserver633882346/apiserver.crt::/tmp/kubernetes-kube-apiserver633882346/apiserver.key
I0114 16:06:21.184818  106102 dynamic_cafile_content.go:181] Shutting down client-ca-bundle::/tmp/kubernetes-kube-apiserver633882346/client-ca.crt
E0114 16:06:21.187215  106102 reflector.go:320] k8s.io/kube-aggregator/pkg/client/informers/externalversions/factory.go:117: Failed to watch *v1.APIService: Get https://127.0.0.1:35591/apis/apiregistration.k8s.io/v1/apiservices?allowWatchBookmarks=true&resourceVersion=7699&timeout=5m53s&timeoutSeconds=353&watch=true: dial tcp 127.0.0.1:35591: connect: connection refused
I0114 16:06:22.137951  106102 serving.go:307] Generated self-signed cert (/tmp/kubernetes-kube-apiserver144043841/apiserver.crt, /tmp/kubernetes-kube-apiserver144043841/apiserver.key)
I0114 16:06:22.137978  106102 server.go:596] external host was not specified, using 127.0.0.1
W0114 16:06:22.137989  106102 authentication.go:439] AnonymousAuth is not allowed with the AlwaysAllow authorizer. Resetting AnonymousAuth to false. You should use a different authorizer
E0114 16:06:22.771830  106102 controller.go:183] an error on the server ("") has prevented the request from succeeding (get endpoints kubernetes)
W0114 16:06:22.866774  106102 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 16:06:22.866803  106102 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 16:06:22.866812  106102 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 16:06:22.866958  106102 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 16:06:22.867778  106102 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
W0114 16:06:22.867812  106102 mutation_detector.go:50] Mutation detector is enabled, this will result in memory leakage.
... skipping 206 lines ...
I0114 16:06:26.925156  106102 cluster_authentication_trust_controller.go:440] Starting cluster_authentication_trust_controller controller
I0114 16:06:26.925414  106102 shared_informer.go:206] Waiting for caches to sync for cluster_authentication_trust_controller
I0114 16:06:26.926079  106102 dynamic_cafile_content.go:166] Starting client-ca-bundle::/tmp/kubernetes-kube-apiserver144043841/client-ca.crt
I0114 16:06:26.926135  106102 dynamic_cafile_content.go:166] Starting request-header::/tmp/kubernetes-kube-apiserver144043841/proxy-ca.crt
I0114 16:06:26.922606  106102 controller.go:86] Starting OpenAPI controller
E0114 16:06:26.922417  106102 controller.go:151] Unable to remove old endpoints from kubernetes service: StorageError: key not found, Code: 1, Key: /7c5bfd04-b5e5-476d-8e5b-c8020b42003e/registry/masterleases/127.0.0.1, ResourceVersion: 0, AdditionalErrorMsg: 
E0114 16:06:26.929387  106102 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0114 16:06:26.935532  106102 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0114 16:06:26.944245  106102 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0114 16:06:26.951711  106102 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0114 16:06:26.953929  106102 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
E0114 16:06:26.972731  106102 structuredmerge.go:102] [SHOULD NOT HAPPEN] failed to create typed new object: .spec.rules: element 0: associative list without keys has an element that's a map type
I0114 16:06:27.008497  106102 cache.go:39] Caches are synced for autoregister controller
I0114 16:06:27.008539  106102 cache.go:39] Caches are synced for AvailableConditionController controller
I0114 16:06:27.008580  106102 cache.go:39] Caches are synced for APIServiceRegistrationController controller
I0114 16:06:27.009044  106102 shared_informer.go:213] Caches are synced for crd-autoregister 
I0114 16:06:27.025772  106102 shared_informer.go:213] Caches are synced for cluster_authentication_trust_controller 
I0114 16:06:27.907011  106102 controller.go:107] OpenAPI AggregationController: Processing item 
... skipping 26 lines ...
    testserver.go:198: Waiting for /healthz to be ok...
    dynamic_client_test.go:88: unexpected pod in list. wanted &v1.Pod{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"test5zrxk", GenerateName:"test", Namespace:"default", SelfLink:"/api/v1/namespaces/default/pods/test5zrxk", UID:"ccd16e83-dc94-4a95-8822-9dd38c054556", ResourceVersion:"8163", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714614788, loc:(*time.Location)(0x753ec80)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"client.test", Operation:"Update", APIVersion:"v1", Time:(*v1.Time)(0xc03adbc760), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc03adbc780)}}}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"test", Image:"test-image", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"Always", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc03ab3bd58), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc0391fd7a0), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration{v1.Toleration{Key:"node.kubernetes.io/not-ready", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc03ab3bde0)}, v1.Toleration{Key:"node.kubernetes.io/unreachable", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc03ab3be20)}}, HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(0xc03ab3be28), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(0xc03ab3be2c), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}, Status:v1.PodStatus{Phase:"Pending", Conditions:[]v1.PodCondition(nil), Message:"", Reason:"", NominatedNodeName:"", HostIP:"", PodIP:"", PodIPs:[]v1.PodIP(nil), StartTime:(*v1.Time)(nil), InitContainerStatuses:[]v1.ContainerStatus(nil), ContainerStatuses:[]v1.ContainerStatus(nil), QOSClass:"BestEffort", EphemeralContainerStatuses:[]v1.ContainerStatus(nil)}}, got &v1.Pod{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"test5zrxk", GenerateName:"test", Namespace:"default", SelfLink:"/api/v1/namespaces/default/pods/test5zrxk", 
UID:"ccd16e83-dc94-4a95-8822-9dd38c054556", ResourceVersion:"8163", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63714614788, loc:(*time.Location)(0x753ec80)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"client.test", Operation:"Update", APIVersion:"v1", Time:(*v1.Time)(0xc03a85e7c0), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc03a85e7a0)}}}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"test", Image:"test-image", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"Always", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc03a905028), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc03915b3e0), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration{v1.Toleration{Key:"node.kubernetes.io/not-ready", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc03a905090)}, v1.Toleration{Key:"node.kubernetes.io/unreachable", Operator:"Exists", Value:"", Effect:"NoExecute", TolerationSeconds:(*int64)(0xc03a9050c0)}}, HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(0xc03a904fd8), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(0xc03a904fa9), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}, Status:v1.PodStatus{Phase:"Pending", Conditions:[]v1.PodCondition(nil), Message:"", Reason:"", NominatedNodeName:"", HostIP:"", PodIP:"", PodIPs:[]v1.PodIP(nil), StartTime:(*v1.Time)(nil), InitContainerStatuses:[]v1.ContainerStatus(nil), ContainerStatuses:[]v1.ContainerStatus(nil), QOSClass:"BestEffort", EphemeralContainerStatuses:[]v1.ContainerStatus(nil)}}
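For orientation on the failure above: the assertion at dynamic_client_test.go:88 compares a Pod listed back through the dynamic (unstructured) client against the originally created typed Pod, producing the wanted/got dump shown. Below is a minimal sketch of that list pattern; it is a simplified illustration, not the test source, and it assumes a context-aware client-go (v0.18+) and a default kubeconfig rather than the throwaway test apiserver the harness starts.

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/runtime/schema"
	"k8s.io/client-go/dynamic"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Build a client config from the default kubeconfig location; the
	// integration test instead targets a freshly started test apiserver.
	config, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client, err := dynamic.NewForConfig(config)
	if err != nil {
		panic(err)
	}
	podGVR := schema.GroupVersionResource{Version: "v1", Resource: "pods"}
	list, err := client.Resource(podGVR).Namespace("default").List(context.TODO(), metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	// Items are unstructured.Unstructured; the real test converts them back
	// to typed v1.Pod objects before comparing with the created pod.
	for _, item := range list.Items {
		fmt.Println(item.GetName())
	}
}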


DONE 2486 tests, 4 skipped, 1 failure in 5.267s
+++ [0114 16:15:08] Saved JUnit XML test report to /logs/artifacts/junit_20200114-160406.xml
make[1]: *** [Makefile:185: test] Error 1
!!! [0114 16:15:08] Call tree:
!!! [0114 16:15:08]  1: hack/make-rules/test-integration.sh:97 runTests(...)
+++ [0114 16:15:08] Cleaning up etcd
+++ [0114 16:15:09] Integration test cleanup complete
make: *** [Makefile:204: test-integration] Error 1
+ EXIT_VALUE=2
+ set +o xtrace
Cleaning up after docker in docker.
================================================================================
[Barnacle] 2020/01/14 16:15:09 Cleaning up Docker data root...
[Barnacle] 2020/01/14 16:15:09 Removing all containers.
... skipping 12 lines ...