PR: yastij: Migrate scheduler to use v1beta1 Event API
Result: FAILURE
Tests: 1 failed / 1714 succeeded
Started: 2019-07-12 08:35
Elapsed: 28m5s
Revision:
Builder: gke-prow-ssd-pool-1a225945-g951
Refs: master:1bf7103e, 78447:f3734dc3
pod: fa957ffd-a47f-11e9-a91c-4e77ce7c1900
infra-commit: 04c2406cc
pod: fa957ffd-a47f-11e9-a91c-4e77ce7c1900
repo: k8s.io/kubernetes
repo-commit: 065235cb4be1183037c75eb2f79305bb2ac1f42a
repos: {u'k8s.io/kubernetes': u'master:1bf7103ea07f8fe0013e9dcc626dda0e01e26afd,78447:f3734dc32b62cb42a6ed983a542f14214528f5ef'}

Test Failures


k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/test/integration TestStatusSubresource 2.72s

go test -v k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/test/integration -run TestStatusSubresource$
I0712 08:57:46.424994  111376 crd_finalizer.go:267] Shutting down CRDFinalizer
I0712 08:57:46.425014  111376 nonstructuralschema_controller.go:203] Shutting down NonStructuralSchemaConditionController
I0712 08:57:46.425033  111376 customresource_discovery_controller.go:219] Shutting down DiscoveryController
I0712 08:57:46.425611  111376 serving.go:312] Generated self-signed cert (/tmp/apiextensions-apiserver623931729/apiserver.crt, /tmp/apiextensions-apiserver623931729/apiserver.key)
W0712 08:57:47.403992  111376 mutation_detector.go:48] Mutation detector is enabled, this will result in memory leakage.
I0712 08:57:47.405363  111376 client.go:354] parsed scheme: ""
I0712 08:57:47.405381  111376 client.go:354] scheme "" not registered, fallback to default scheme
I0712 08:57:47.405416  111376 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0712 08:57:47.405512  111376 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0712 08:57:47.405834  111376 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:57:47.407096  111376 mutation_detector.go:48] Mutation detector is enabled, this will result in memory leakage.
I0712 08:57:47.408721  111376 secure_serving.go:116] Serving securely on 127.0.0.1:41357
E0712 08:57:47.409284  111376 reflector.go:125] k8s.io/client-go/informers/factory.go:133: Failed to list *v1.Service: Get http://127.1.2.3:12345/api/v1/services?limit=500&resourceVersion=0: dial tcp 127.1.2.3:12345: connect: connection refused
I0712 08:57:47.409318  111376 crd_finalizer.go:255] Starting CRDFinalizer
I0712 08:57:47.409379  111376 customresource_discovery_controller.go:208] Starting DiscoveryController
I0712 08:57:47.409418  111376 naming_controller.go:288] Starting NamingConditionController
I0712 08:57:47.409434  111376 establishing_controller.go:73] Starting EstablishingController
I0712 08:57:47.409454  111376 nonstructuralschema_controller.go:191] Starting NonStructuralSchemaConditionController
I0712 08:57:47.425962  111376 client.go:354] parsed scheme: ""
I0712 08:57:47.425984  111376 client.go:354] scheme "" not registered, fallback to default scheme
I0712 08:57:47.426019  111376 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0712 08:57:47.426094  111376 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0712 08:57:47.426440  111376 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0712 08:57:48.027011  111376 client.go:354] parsed scheme: ""
I0712 08:57:48.027041  111376 client.go:354] scheme "" not registered, fallback to default scheme
I0712 08:57:48.027081  111376 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0712 08:57:48.027156  111376 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0712 08:57:48.027919  111376 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0712 08:57:48.029486  111376 client.go:354] parsed scheme: ""
I0712 08:57:48.029546  111376 client.go:354] scheme "" not registered, fallback to default scheme
I0712 08:57:48.029587  111376 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0712 08:57:48.029673  111376 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0712 08:57:48.030659  111376 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
E0712 08:57:48.410364  111376 reflector.go:125] k8s.io/client-go/informers/factory.go:133: Failed to list *v1.Service: Get http://127.1.2.3:12345/api/v1/services?limit=500&resourceVersion=0: dial tcp 127.1.2.3:12345: connect: connection refused
I0712 08:57:49.070688  111376 client.go:354] parsed scheme: ""
I0712 08:57:49.070721  111376 client.go:354] scheme "" not registered, fallback to default scheme
I0712 08:57:49.070757  111376 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0712 08:57:49.070815  111376 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0712 08:57:49.071713  111376 client.go:354] parsed scheme: ""
I0712 08:57:49.071729  111376 client.go:354] scheme "" not registered, fallback to default scheme
I0712 08:57:49.071757  111376 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0712 08:57:49.071797  111376 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0712 08:57:49.072121  111376 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0712 08:57:49.073268  111376 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:57:49.144474  111376 cacher.go:154] Terminating all watchers from cacher *apiextensions.CustomResourceDefinition
I0712 08:57:49.145158  111376 secure_serving.go:160] Stopped listening on 127.0.0.1:41357
testserver.go:141: runtime-config=map[api/all:true]
testserver.go:142: Starting apiextensions-apiserver on port 41357...
testserver.go:160: Waiting for /healthz to be ok...
subresources_test.go:194: unable to update status: Operation cannot be fulfilled on noxus.mygroup.example.com "foo": StorageError: invalid object, Code: 4, Key: /9f8f46c0-4b90-42ec-bf27-ffaa036750c5/mygroup.example.com/noxus/not-the-default/foo, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: e5100535-11c1-4461-a972-36e80fca0375, UID in object meta: 8c3aa698-5a39-432d-8810-3b43a31758de
				from junit_b08e264f3d2ff14dff3b873d155e207833096397_20190712-084950.xml
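
The single failure above is a status-update race: the test wrote back a copy of the noxu custom resource whose UID no longer matched the live object, so the storage layer rejected the write with a precondition error. Below is a minimal, hypothetical sketch of the usual mitigation — re-fetch the object and retry the status write on conflict using client-go's retry helper. The group/version/resource, the "foo" name, and the .status.num field are taken from the error text; the function and package names are illustrative, not the test's actual code.

// Hedged sketch, not the test's actual code: always work from a freshly
// fetched copy so the UID/resourceVersion precondition matches the live
// object, and retry if the write races with a concurrent update.
package example

import (
	"context"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/apis/meta/v1/unstructured"
	"k8s.io/apimachinery/pkg/runtime/schema"
	"k8s.io/client-go/dynamic"
	"k8s.io/client-go/util/retry"
)

// updateNoxuStatus sets .status.num on the "foo" custom resource
// (noxus.mygroup.example.com, the object named in the failure above).
// Re-fetching inside the retry loop avoids writing back a stale copy.
func updateNoxuStatus(ctx context.Context, client dynamic.Interface, ns string) error {
	gvr := schema.GroupVersionResource{Group: "mygroup.example.com", Version: "v1beta1", Resource: "noxus"}
	return retry.RetryOnConflict(retry.DefaultRetry, func() error {
		cur, err := client.Resource(gvr).Namespace(ns).Get(ctx, "foo", metav1.GetOptions{})
		if err != nil {
			return err
		}
		if err := unstructured.SetNestedField(cur.Object, int64(2), "status", "num"); err != nil {
			return err
		}
		_, err = client.Resource(gvr).Namespace(ns).UpdateStatus(ctx, cur, metav1.UpdateOptions{})
		return err
	})
}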



1714 Passed Tests

4 Skipped Tests

Error lines from build-log.txt

... skipping 323 lines ...
W0712 08:44:16.566] I0712 08:44:16.565273   48560 serving.go:312] Generated self-signed cert (/tmp/apiserver.crt, /tmp/apiserver.key)
W0712 08:44:16.566] I0712 08:44:16.565519   48560 server.go:560] external host was not specified, using 172.17.0.2
W0712 08:44:16.566] W0712 08:44:16.565547   48560 authentication.go:415] AnonymousAuth is not allowed with the AlwaysAllow authorizer. Resetting AnonymousAuth to false. You should use a different authorizer
W0712 08:44:16.567] I0712 08:44:16.566114   48560 server.go:149] Version: v1.16.0-alpha.0.2190+065235cb4be118
W0712 08:44:17.076] I0712 08:44:17.075525   48560 plugins.go:158] Loaded 4 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority.
W0712 08:44:17.076] I0712 08:44:17.075564   48560 plugins.go:161] Loaded 4 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ResourceQuota.
W0712 08:44:17.077] E0712 08:44:17.076002   48560 prometheus.go:55] failed to register depth metric admission_quota_controller: duplicate metrics collector registration attempted
W0712 08:44:17.077] E0712 08:44:17.076042   48560 prometheus.go:68] failed to register adds metric admission_quota_controller: duplicate metrics collector registration attempted
W0712 08:44:17.077] E0712 08:44:17.076071   48560 prometheus.go:82] failed to register latency metric admission_quota_controller: duplicate metrics collector registration attempted
W0712 08:44:17.077] E0712 08:44:17.076092   48560 prometheus.go:96] failed to register workDuration metric admission_quota_controller: duplicate metrics collector registration attempted
W0712 08:44:17.078] E0712 08:44:17.076116   48560 prometheus.go:112] failed to register unfinished metric admission_quota_controller: duplicate metrics collector registration attempted
W0712 08:44:17.078] E0712 08:44:17.076136   48560 prometheus.go:126] failed to register unfinished metric admission_quota_controller: duplicate metrics collector registration attempted
W0712 08:44:17.078] E0712 08:44:17.076162   48560 prometheus.go:152] failed to register depth metric admission_quota_controller: duplicate metrics collector registration attempted
W0712 08:44:17.078] E0712 08:44:17.076181   48560 prometheus.go:164] failed to register adds metric admission_quota_controller: duplicate metrics collector registration attempted
W0712 08:44:17.079] E0712 08:44:17.076237   48560 prometheus.go:176] failed to register latency metric admission_quota_controller: duplicate metrics collector registration attempted
W0712 08:44:17.079] E0712 08:44:17.076274   48560 prometheus.go:188] failed to register work_duration metric admission_quota_controller: duplicate metrics collector registration attempted
W0712 08:44:17.079] E0712 08:44:17.076299   48560 prometheus.go:203] failed to register unfinished_work_seconds metric admission_quota_controller: duplicate metrics collector registration attempted
W0712 08:44:17.079] E0712 08:44:17.076323   48560 prometheus.go:216] failed to register longest_running_processor_microseconds metric admission_quota_controller: duplicate metrics collector registration attempted
W0712 08:44:17.080] I0712 08:44:17.076349   48560 plugins.go:158] Loaded 4 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority.
W0712 08:44:17.080] I0712 08:44:17.076355   48560 plugins.go:161] Loaded 4 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ResourceQuota.
W0712 08:44:17.080] I0712 08:44:17.077901   48560 client.go:354] parsed scheme: ""
W0712 08:44:17.080] I0712 08:44:17.077939   48560 client.go:354] scheme "" not registered, fallback to default scheme
W0712 08:44:17.081] I0712 08:44:17.078001   48560 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W0712 08:44:17.081] I0712 08:44:17.078201   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
... skipping 298 lines ...
W0712 08:44:17.451] W0712 08:44:17.451237   48560 genericapiserver.go:390] Skipping API node.k8s.io/v1alpha1 because it has no resources.
W0712 08:44:17.477] W0712 08:44:17.477269   48560 genericapiserver.go:390] Skipping API rbac.authorization.k8s.io/v1alpha1 because it has no resources.
W0712 08:44:17.487] W0712 08:44:17.487362   48560 genericapiserver.go:390] Skipping API scheduling.k8s.io/v1alpha1 because it has no resources.
W0712 08:44:17.504] W0712 08:44:17.504351   48560 genericapiserver.go:390] Skipping API storage.k8s.io/v1alpha1 because it has no resources.
W0712 08:44:17.525] W0712 08:44:17.524621   48560 genericapiserver.go:390] Skipping API apps/v1beta2 because it has no resources.
W0712 08:44:17.525] W0712 08:44:17.524660   48560 genericapiserver.go:390] Skipping API apps/v1beta1 because it has no resources.
W0712 08:44:17.535] E0712 08:44:17.534450   48560 prometheus.go:55] failed to register depth metric admission_quota_controller: duplicate metrics collector registration attempted
W0712 08:44:17.535] E0712 08:44:17.534496   48560 prometheus.go:68] failed to register adds metric admission_quota_controller: duplicate metrics collector registration attempted
W0712 08:44:17.535] E0712 08:44:17.534530   48560 prometheus.go:82] failed to register latency metric admission_quota_controller: duplicate metrics collector registration attempted
W0712 08:44:17.535] E0712 08:44:17.534547   48560 prometheus.go:96] failed to register workDuration metric admission_quota_controller: duplicate metrics collector registration attempted
W0712 08:44:17.536] E0712 08:44:17.534621   48560 prometheus.go:112] failed to register unfinished metric admission_quota_controller: duplicate metrics collector registration attempted
W0712 08:44:17.536] E0712 08:44:17.534658   48560 prometheus.go:126] failed to register unfinished metric admission_quota_controller: duplicate metrics collector registration attempted
W0712 08:44:17.536] E0712 08:44:17.534670   48560 prometheus.go:152] failed to register depth metric admission_quota_controller: duplicate metrics collector registration attempted
W0712 08:44:17.536] E0712 08:44:17.534681   48560 prometheus.go:164] failed to register adds metric admission_quota_controller: duplicate metrics collector registration attempted
W0712 08:44:17.537] E0712 08:44:17.534719   48560 prometheus.go:176] failed to register latency metric admission_quota_controller: duplicate metrics collector registration attempted
W0712 08:44:17.537] E0712 08:44:17.534749   48560 prometheus.go:188] failed to register work_duration metric admission_quota_controller: duplicate metrics collector registration attempted
W0712 08:44:17.537] E0712 08:44:17.534764   48560 prometheus.go:203] failed to register unfinished_work_seconds metric admission_quota_controller: duplicate metrics collector registration attempted
W0712 08:44:17.538] E0712 08:44:17.534777   48560 prometheus.go:216] failed to register longest_running_processor_microseconds metric admission_quota_controller: duplicate metrics collector registration attempted
W0712 08:44:17.538] I0712 08:44:17.534804   48560 plugins.go:158] Loaded 4 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority.
W0712 08:44:17.539] I0712 08:44:17.534810   48560 plugins.go:161] Loaded 4 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ResourceQuota.
W0712 08:44:17.539] I0712 08:44:17.536103   48560 client.go:354] parsed scheme: ""
W0712 08:44:17.539] I0712 08:44:17.536128   48560 client.go:354] scheme "" not registered, fallback to default scheme
W0712 08:44:17.539] I0712 08:44:17.536163   48560 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W0712 08:44:17.539] I0712 08:44:17.536199   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
... skipping 89 lines ...
W0712 08:44:52.817] I0712 08:44:52.816276   51895 controller_utils.go:1029] Waiting for caches to sync for persistent volume controller
W0712 08:44:52.817] I0712 08:44:52.816866   51895 controllermanager.go:534] Started "persistentvolume-expander"
W0712 08:44:52.817] W0712 08:44:52.816901   51895 controllermanager.go:526] Skipping "ttl-after-finished"
W0712 08:44:52.817] I0712 08:44:52.817065   51895 expand_controller.go:300] Starting expand controller
W0712 08:44:52.817] I0712 08:44:52.817137   51895 controller_utils.go:1029] Waiting for caches to sync for expand controller
W0712 08:44:52.817] I0712 08:44:52.817271   51895 node_lifecycle_controller.go:77] Sending events to api server
W0712 08:44:52.818] E0712 08:44:52.817329   51895 core.go:175] failed to start cloud node lifecycle controller: no cloud provider provided
W0712 08:44:52.818] W0712 08:44:52.817342   51895 controllermanager.go:526] Skipping "cloud-node-lifecycle"
W0712 08:44:52.818] W0712 08:44:52.818010   51895 probe.go:268] Flexvolume plugin directory at /usr/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
W0712 08:44:52.819] I0712 08:44:52.818848   51895 controllermanager.go:534] Started "attachdetach"
W0712 08:44:52.819] I0712 08:44:52.819017   51895 attach_detach_controller.go:335] Starting attach detach controller
W0712 08:44:52.819] I0712 08:44:52.819044   51895 controller_utils.go:1029] Waiting for caches to sync for attach detach controller
W0712 08:44:52.826] I0712 08:44:52.825648   51895 controllermanager.go:534] Started "namespace"
... skipping 38 lines ...
W0712 08:44:52.843] I0712 08:44:52.843077   51895 controllermanager.go:534] Started "replicationcontroller"
W0712 08:44:52.843] I0712 08:44:52.843197   51895 replica_set.go:182] Starting replicationcontroller controller
W0712 08:44:52.843] I0712 08:44:52.843218   51895 controller_utils.go:1029] Waiting for caches to sync for ReplicationController controller
W0712 08:44:52.844] I0712 08:44:52.843999   51895 controllermanager.go:534] Started "daemonset"
W0712 08:44:52.844] I0712 08:44:52.844095   51895 daemon_controller.go:267] Starting daemon sets controller
W0712 08:44:52.844] I0712 08:44:52.844114   51895 controller_utils.go:1029] Waiting for caches to sync for daemon sets controller
W0712 08:44:52.845] E0712 08:44:52.844814   51895 core.go:78] Failed to start service controller: WARNING: no cloud provider provided, services of type LoadBalancer will fail
W0712 08:44:52.845] W0712 08:44:52.844835   51895 controllermanager.go:526] Skipping "service"
W0712 08:44:52.845] I0712 08:44:52.845457   51895 controllermanager.go:534] Started "clusterrole-aggregation"
W0712 08:44:52.845] I0712 08:44:52.845479   51895 clusterroleaggregation_controller.go:148] Starting ClusterRoleAggregator
W0712 08:44:52.846] I0712 08:44:52.845499   51895 controller_utils.go:1029] Waiting for caches to sync for ClusterRoleAggregator controller
W0712 08:44:52.846] I0712 08:44:52.845995   51895 controllermanager.go:534] Started "pv-protection"
W0712 08:44:52.846] I0712 08:44:52.846119   51895 pv_protection_controller.go:82] Starting PV protection controller
... skipping 25 lines ...
W0712 08:44:53.005] I0712 08:44:53.003457   51895 resource_quota_controller.go:271] Starting resource quota controller
W0712 08:44:53.005] I0712 08:44:53.003493   51895 controller_utils.go:1029] Waiting for caches to sync for resource quota controller
W0712 08:44:53.006] I0712 08:44:53.003518   51895 resource_quota_monitor.go:303] QuotaMonitor running
W0712 08:44:53.022] I0712 08:44:53.021589   51895 controller_utils.go:1036] Caches are synced for service account controller
W0712 08:44:53.024] I0712 08:44:53.024041   48560 controller.go:606] quota admission added evaluator for: serviceaccounts
W0712 08:44:53.026] I0712 08:44:53.026153   51895 controller_utils.go:1036] Caches are synced for namespace controller
W0712 08:44:53.033] W0712 08:44:53.033060   51895 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="127.0.0.1" does not exist
W0712 08:44:53.036] I0712 08:44:53.035962   51895 controller_utils.go:1036] Caches are synced for GC controller
W0712 08:44:53.039] I0712 08:44:53.039301   51895 controller_utils.go:1036] Caches are synced for HPA controller
W0712 08:44:53.040] I0712 08:44:53.039753   51895 controller_utils.go:1036] Caches are synced for certificate controller
W0712 08:44:53.040] I0712 08:44:53.040146   51895 controller_utils.go:1036] Caches are synced for PVC protection controller
W0712 08:44:53.043] I0712 08:44:53.043625   51895 controller_utils.go:1036] Caches are synced for ReplicationController controller
W0712 08:44:53.046] I0712 08:44:53.045694   51895 controller_utils.go:1036] Caches are synced for ClusterRoleAggregator controller
W0712 08:44:53.046] I0712 08:44:53.046323   51895 controller_utils.go:1036] Caches are synced for PV protection controller
W0712 08:44:53.059] E0712 08:44:53.058934   51895 clusterroleaggregation_controller.go:180] view failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "view": the object has been modified; please apply your changes to the latest version and try again
W0712 08:44:53.059] E0712 08:44:53.059207   51895 clusterroleaggregation_controller.go:180] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again
W0712 08:44:53.116] I0712 08:44:53.116129   51895 controller_utils.go:1036] Caches are synced for stateful set controller
W0712 08:44:53.117] I0712 08:44:53.116128   51895 controller_utils.go:1036] Caches are synced for TTL controller
W0712 08:44:53.117] I0712 08:44:53.116517   51895 controller_utils.go:1036] Caches are synced for persistent volume controller
W0712 08:44:53.117] I0712 08:44:53.117318   51895 controller_utils.go:1036] Caches are synced for expand controller
I0712 08:44:53.218] NAME         TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
I0712 08:44:53.218] kubernetes   ClusterIP   10.0.0.1     <none>        443/TCP   30s
... skipping 95 lines ...
I0712 08:44:56.373] +++ working dir: /go/src/k8s.io/kubernetes
I0712 08:44:56.375] +++ command: run_RESTMapper_evaluation_tests
I0712 08:44:56.385] +++ [0712 08:44:56] Creating namespace namespace-1562921096-19956
I0712 08:44:56.447] namespace/namespace-1562921096-19956 created
I0712 08:44:56.503] Context "test" modified.
I0712 08:44:56.508] +++ [0712 08:44:56] Testing RESTMapper
I0712 08:44:56.599] +++ [0712 08:44:56] "kubectl get unknownresourcetype" returns error as expected: error: the server doesn't have a resource type "unknownresourcetype"
I0712 08:44:56.611] +++ exit code: 0
I0712 08:44:56.719] NAME                              SHORTNAMES   APIGROUP                       NAMESPACED   KIND
I0712 08:44:56.719] bindings                                                                      true         Binding
I0712 08:44:56.719] componentstatuses                 cs                                          false        ComponentStatus
I0712 08:44:56.720] configmaps                        cm                                          true         ConfigMap
I0712 08:44:56.720] endpoints                         ep                                          true         Endpoints
... skipping 664 lines ...
I0712 08:45:14.668] poddisruptionbudget.policy/test-pdb-3 created
I0712 08:45:14.762] core.sh:251: Successful get pdb/test-pdb-3 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 2
I0712 08:45:14.834] poddisruptionbudget.policy/test-pdb-4 created
I0712 08:45:14.930] core.sh:255: Successful get pdb/test-pdb-4 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 50%
I0712 08:45:15.089] core.sh:261: Successful get pods --namespace=test-kubectl-describe-pod {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:45:15.275] pod/env-test-pod created
W0712 08:45:15.376] error: resource(s) were provided, but no name, label selector, or --all flag specified
W0712 08:45:15.376] error: setting 'all' parameter but found a non empty selector. 
W0712 08:45:15.377] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0712 08:45:15.377] I0712 08:45:14.333319   48560 controller.go:606] quota admission added evaluator for: poddisruptionbudgets.policy
W0712 08:45:15.377] error: min-available and max-unavailable cannot be both specified
I0712 08:45:15.478] core.sh:264: Successful describe pods --namespace=test-kubectl-describe-pod env-test-pod:
I0712 08:45:15.478] Name:         env-test-pod
I0712 08:45:15.478] Namespace:    test-kubectl-describe-pod
I0712 08:45:15.478] Priority:     0
I0712 08:45:15.479] Node:         <none>
I0712 08:45:15.479] Labels:       <none>
... skipping 173 lines ...
I0712 08:45:28.739] pod/valid-pod patched
I0712 08:45:28.838] core.sh:470: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: changed-with-yaml:
I0712 08:45:28.910] pod/valid-pod patched
I0712 08:45:28.999] core.sh:475: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:3.1:
I0712 08:45:29.150] pod/valid-pod patched
I0712 08:45:29.247] core.sh:491: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
I0712 08:45:29.435] +++ [0712 08:45:29] "kubectl patch with resourceVersion 494" returns error as expected: Error from server (Conflict): Operation cannot be fulfilled on pods "valid-pod": the object has been modified; please apply your changes to the latest version and try again
I0712 08:45:29.664] pod "valid-pod" deleted
I0712 08:45:29.678] pod/valid-pod replaced
I0712 08:45:29.773] core.sh:515: Successful get pod valid-pod {{(index .spec.containers 0).name}}: replaced-k8s-serve-hostname
I0712 08:45:29.925] Successful
I0712 08:45:29.925] message:error: --grace-period must have --force specified
I0712 08:45:29.926] has:\-\-grace-period must have \-\-force specified
I0712 08:45:30.082] Successful
I0712 08:45:30.083] message:error: --timeout must have --force specified
I0712 08:45:30.083] has:\-\-timeout must have \-\-force specified
I0712 08:45:30.228] node/node-v1-test created
W0712 08:45:30.329] W0712 08:45:30.228794   51895 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="node-v1-test" does not exist
I0712 08:45:30.430] node/node-v1-test replaced
I0712 08:45:30.469] core.sh:552: Successful get node node-v1-test {{.metadata.annotations.a}}: b
I0712 08:45:30.544] node "node-v1-test" deleted
I0712 08:45:30.637] core.sh:559: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
I0712 08:45:30.907] core.sh:562: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/serve_hostname:
I0712 08:45:31.876] core.sh:575: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
... skipping 66 lines ...
I0712 08:45:35.918] save-config.sh:31: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:45:36.075] pod/test-pod created
W0712 08:45:36.176] Edit cancelled, no changes made.
W0712 08:45:36.176] Edit cancelled, no changes made.
W0712 08:45:36.177] Edit cancelled, no changes made.
W0712 08:45:36.177] Edit cancelled, no changes made.
W0712 08:45:36.177] error: 'name' already has a value (valid-pod), and --overwrite is false
W0712 08:45:36.177] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0712 08:45:36.178] Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
I0712 08:45:36.278] pod "test-pod" deleted
I0712 08:45:36.279] +++ [0712 08:45:36] Creating namespace namespace-1562921136-21995
I0712 08:45:36.329] namespace/namespace-1562921136-21995 created
I0712 08:45:36.401] Context "test" modified.
... skipping 41 lines ...
I0712 08:45:39.413] +++ Running case: test-cmd.run_kubectl_create_error_tests 
I0712 08:45:39.416] +++ working dir: /go/src/k8s.io/kubernetes
I0712 08:45:39.418] +++ command: run_kubectl_create_error_tests
I0712 08:45:39.430] +++ [0712 08:45:39] Creating namespace namespace-1562921139-7033
I0712 08:45:39.499] namespace/namespace-1562921139-7033 created
I0712 08:45:39.573] Context "test" modified.
I0712 08:45:39.580] +++ [0712 08:45:39] Testing kubectl create with error
W0712 08:45:39.680] Error: must specify one of -f and -k
W0712 08:45:39.681] 
W0712 08:45:39.681] Create a resource from a file or from stdin.
W0712 08:45:39.681] 
W0712 08:45:39.681]  JSON and YAML formats are accepted.
W0712 08:45:39.681] 
W0712 08:45:39.681] Examples:
... skipping 41 lines ...
W0712 08:45:39.687] 
W0712 08:45:39.687] Usage:
W0712 08:45:39.687]   kubectl create -f FILENAME [options]
W0712 08:45:39.687] 
W0712 08:45:39.688] Use "kubectl <command> --help" for more information about a given command.
W0712 08:45:39.688] Use "kubectl options" for a list of global command-line options (applies to all commands).
I0712 08:45:39.800] +++ [0712 08:45:39] "kubectl create with empty string list returns error as expected: error: error validating "hack/testdata/invalid-rc-with-empty-args.yaml": error validating data: ValidationError(ReplicationController.spec.template.spec.containers[0].args): unknown object type "nil" in ReplicationController.spec.template.spec.containers[0].args[0]; if you choose to ignore these errors, turn validation off with --validate=false
W0712 08:45:39.901] kubectl convert is DEPRECATED and will be removed in a future version.
W0712 08:45:39.901] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
I0712 08:45:40.001] +++ exit code: 0
I0712 08:45:40.003] Recording: run_kubectl_apply_tests
I0712 08:45:40.004] Running command: run_kubectl_apply_tests
I0712 08:45:40.025] 
... skipping 19 lines ...
W0712 08:45:42.056] I0712 08:45:42.055492   48560 client.go:354] parsed scheme: ""
W0712 08:45:42.056] I0712 08:45:42.055530   48560 client.go:354] scheme "" not registered, fallback to default scheme
W0712 08:45:42.056] I0712 08:45:42.055588   48560 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W0712 08:45:42.056] I0712 08:45:42.055711   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:45:42.057] I0712 08:45:42.056384   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:45:42.059] I0712 08:45:42.059430   48560 controller.go:606] quota admission added evaluator for: resources.mygroup.example.com
W0712 08:45:42.142] Error from server (NotFound): resources.mygroup.example.com "myobj" not found
I0712 08:45:42.243] kind.mygroup.example.com/myobj serverside-applied (server dry run)
I0712 08:45:42.243] customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
I0712 08:45:42.244] +++ exit code: 0
I0712 08:45:42.272] Recording: run_kubectl_run_tests
I0712 08:45:42.272] Running command: run_kubectl_run_tests
I0712 08:45:42.290] 
... skipping 95 lines ...
I0712 08:45:44.508] Context "test" modified.
I0712 08:45:44.514] +++ [0712 08:45:44] Testing kubectl create filter
I0712 08:45:44.596] create.sh:30: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:45:44.741] pod/selector-test-pod created
I0712 08:45:44.826] create.sh:34: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
I0712 08:45:44.910] Successful
I0712 08:45:44.911] message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
I0712 08:45:44.911] has:pods "selector-test-pod-dont-apply" not found
I0712 08:45:44.979] pod "selector-test-pod" deleted
I0712 08:45:44.996] +++ exit code: 0
I0712 08:45:45.029] Recording: run_kubectl_apply_deployments_tests
I0712 08:45:45.030] Running command: run_kubectl_apply_deployments_tests
I0712 08:45:45.047] 
... skipping 27 lines ...
I0712 08:45:46.638] apps.sh:139: Successful get replicasets {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:45:46.710] apps.sh:140: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:45:46.794] apps.sh:144: Successful get deployments {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:45:46.949] deployment.apps/nginx created
I0712 08:45:47.047] apps.sh:148: Successful get deployment nginx {{.metadata.name}}: nginx
I0712 08:45:51.290] Successful
I0712 08:45:51.291] message:Error from server (Conflict): error when applying patch:
I0712 08:45:51.291] {"metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1562921145-31062\",\"resourceVersion\":\"99\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx2\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx2\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"},"resourceVersion":"99"},"spec":{"selector":{"matchLabels":{"name":"nginx2"}},"template":{"metadata":{"labels":{"name":"nginx2"}}}}}
I0712 08:45:51.291] to:
I0712 08:45:51.292] Resource: "apps/v1, Resource=deployments", GroupVersionKind: "apps/v1, Kind=Deployment"
I0712 08:45:51.292] Name: "nginx", Namespace: "namespace-1562921145-31062"
I0712 08:45:51.295] Object: &{map["apiVersion":"apps/v1" "kind":"Deployment" "metadata":map["annotations":map["deployment.kubernetes.io/revision":"1" "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1562921145-31062\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx1\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx1\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"] "creationTimestamp":"2019-07-12T08:45:46Z" "generation":'\x01' "labels":map["name":"nginx"] "managedFields":[map["apiVersion":"apps/v1" "fields":map["f:metadata":map["f:annotations":map["f:deployment.kubernetes.io/revision":map[]]] "f:status":map["f:conditions":map[".":map[] "k:{\"type\":\"Available\"}":map[".":map[] "f:lastTransitionTime":map[] "f:lastUpdateTime":map[] "f:message":map[] "f:reason":map[] "f:status":map[] "f:type":map[]] "k:{\"type\":\"Progressing\"}":map[".":map[] "f:lastTransitionTime":map[] "f:lastUpdateTime":map[] "f:message":map[] "f:reason":map[] "f:status":map[] "f:type":map[]]] "f:observedGeneration":map[] "f:replicas":map[] "f:unavailableReplicas":map[] "f:updatedReplicas":map[]]] "manager":"kube-controller-manager" "operation":"Update" "time":"2019-07-12T08:45:46Z"] map["apiVersion":"apps/v1" "fields":map["f:metadata":map["f:annotations":map[".":map[] "f:kubectl.kubernetes.io/last-applied-configuration":map[]] "f:labels":map[".":map[] "f:name":map[]]] "f:spec":map["f:progressDeadlineSeconds":map[] "f:replicas":map[] "f:revisionHistoryLimit":map[] "f:selector":map["f:matchLabels":map[".":map[] "f:name":map[]]] "f:strategy":map["f:rollingUpdate":map[".":map[] "f:maxSurge":map[] "f:maxUnavailable":map[]] "f:type":map[]] "f:template":map["f:metadata":map["f:labels":map[".":map[] "f:name":map[]]] "f:spec":map["f:containers":map["k:{\"name\":\"nginx\"}":map[".":map[] "f:image":map[] "f:imagePullPolicy":map[] "f:name":map[] "f:ports":map[".":map[] "k:{\"containerPort\":80,\"protocol\":\"TCP\"}":map[".":map[] "f:containerPort":map[] "f:protocol":map[]]] "f:resources":map[] "f:terminationMessagePath":map[] "f:terminationMessagePolicy":map[]]] "f:dnsPolicy":map[] "f:restartPolicy":map[] "f:schedulerName":map[] "f:securityContext":map[] "f:terminationGracePeriodSeconds":map[]]]]] "manager":"kubectl" "operation":"Update" "time":"2019-07-12T08:45:46Z"]] "name":"nginx" "namespace":"namespace-1562921145-31062" "resourceVersion":"585" "selfLink":"/apis/apps/v1/namespaces/namespace-1562921145-31062/deployments/nginx" "uid":"3041606a-cccc-494e-959b-970ad060de73"] "spec":map["progressDeadlineSeconds":'\u0258' "replicas":'\x03' "revisionHistoryLimit":'\n' "selector":map["matchLabels":map["name":"nginx1"]] "strategy":map["rollingUpdate":map["maxSurge":"25%" "maxUnavailable":"25%"] "type":"RollingUpdate"] "template":map["metadata":map["creationTimestamp":<nil> "labels":map["name":"nginx1"]] "spec":map["containers":[map["image":"k8s.gcr.io/nginx:test-cmd" "imagePullPolicy":"IfNotPresent" "name":"nginx" "ports":[map["containerPort":'P' "protocol":"TCP"]] "resources":map[] "terminationMessagePath":"/dev/termination-log" "terminationMessagePolicy":"File"]] "dnsPolicy":"ClusterFirst" "restartPolicy":"Always" "schedulerName":"default-scheduler" "securityContext":map[] "terminationGracePeriodSeconds":'\x1e']]] 
"status":map["conditions":[map["lastTransitionTime":"2019-07-12T08:45:46Z" "lastUpdateTime":"2019-07-12T08:45:46Z" "message":"Deployment does not have minimum availability." "reason":"MinimumReplicasUnavailable" "status":"False" "type":"Available"] map["lastTransitionTime":"2019-07-12T08:45:46Z" "lastUpdateTime":"2019-07-12T08:45:46Z" "message":"ReplicaSet \"nginx-55474bb66f\" is progressing." "reason":"ReplicaSetUpdated" "status":"True" "type":"Progressing"]] "observedGeneration":'\x01' "replicas":'\x03' "unavailableReplicas":'\x03' "updatedReplicas":'\x03']]}
I0712 08:45:51.295] for: "hack/testdata/deployment-label-change2.yaml": Operation cannot be fulfilled on deployments.apps "nginx": the object has been modified; please apply your changes to the latest version and try again
I0712 08:45:51.295] has:Error from server (Conflict)
W0712 08:45:51.396] I0712 08:45:46.956005   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921145-31062", Name:"nginx", UID:"3041606a-cccc-494e-959b-970ad060de73", APIVersion:"apps/v1", ResourceVersion:"572", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-55474bb66f to 3
W0712 08:45:51.396] I0712 08:45:46.960600   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921145-31062", Name:"nginx-55474bb66f", UID:"4c3b3579-3c9e-4ab9-bd87-d5cab451c2a9", APIVersion:"apps/v1", ResourceVersion:"573", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-55474bb66f-2v55l
W0712 08:45:51.396] I0712 08:45:46.964801   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921145-31062", Name:"nginx-55474bb66f", UID:"4c3b3579-3c9e-4ab9-bd87-d5cab451c2a9", APIVersion:"apps/v1", ResourceVersion:"573", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-55474bb66f-sqm59
W0712 08:45:51.397] I0712 08:45:46.966834   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921145-31062", Name:"nginx-55474bb66f", UID:"4c3b3579-3c9e-4ab9-bd87-d5cab451c2a9", APIVersion:"apps/v1", ResourceVersion:"573", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-55474bb66f-6qrq9
W0712 08:45:53.862] I0712 08:45:53.861719   51895 horizontal.go:341] Horizontal Pod Autoscaler frontend has been deleted in namespace-1562921137-20430
W0712 08:45:55.699] E0712 08:45:55.698326   51895 replica_set.go:450] Sync "namespace-1562921145-31062/nginx-55474bb66f" failed with Operation cannot be fulfilled on replicasets.apps "nginx-55474bb66f": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1562921145-31062/nginx-55474bb66f, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 4c3b3579-3c9e-4ab9-bd87-d5cab451c2a9, UID in object meta: 
I0712 08:45:56.518] deployment.apps/nginx configured
I0712 08:45:56.615] Successful
I0712 08:45:56.616] message:        "name": "nginx2"
I0712 08:45:56.616]           "name": "nginx2"
I0712 08:45:56.616] has:"name": "nginx2"
W0712 08:45:56.716] I0712 08:45:56.522983   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921145-31062", Name:"nginx", UID:"04113cc5-0fa3-4179-a307-6e263d68e6eb", APIVersion:"apps/v1", ResourceVersion:"608", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-56bb4894f8 to 3
... skipping 168 lines ...
I0712 08:46:03.764] +++ [0712 08:46:03] Creating namespace namespace-1562921163-10243
I0712 08:46:03.833] namespace/namespace-1562921163-10243 created
I0712 08:46:03.899] Context "test" modified.
I0712 08:46:03.906] +++ [0712 08:46:03] Testing kubectl get
I0712 08:46:03.994] get.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:46:04.076] Successful
I0712 08:46:04.076] message:Error from server (NotFound): pods "abc" not found
I0712 08:46:04.077] has:pods "abc" not found
I0712 08:46:04.162] get.sh:37: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:46:04.246] Successful
I0712 08:46:04.247] message:Error from server (NotFound): pods "abc" not found
I0712 08:46:04.247] has:pods "abc" not found
I0712 08:46:04.334] get.sh:45: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:46:04.416] Successful
I0712 08:46:04.416] message:{
I0712 08:46:04.416]     "apiVersion": "v1",
I0712 08:46:04.416]     "items": [],
... skipping 23 lines ...
I0712 08:46:04.735] has not:No resources found
I0712 08:46:04.815] Successful
I0712 08:46:04.815] message:NAME
I0712 08:46:04.816] has not:No resources found
I0712 08:46:04.901] get.sh:73: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:46:04.992] Successful
I0712 08:46:04.992] message:error: the server doesn't have a resource type "foobar"
I0712 08:46:04.993] has not:No resources found
I0712 08:46:05.073] Successful
I0712 08:46:05.073] message:No resources found in namespace-1562921163-10243 namespace.
I0712 08:46:05.073] has:No resources found
I0712 08:46:05.154] Successful
I0712 08:46:05.154] message:
I0712 08:46:05.154] has not:No resources found
I0712 08:46:05.235] Successful
I0712 08:46:05.235] message:No resources found in namespace-1562921163-10243 namespace.
I0712 08:46:05.235] has:No resources found
I0712 08:46:05.326] get.sh:93: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:46:05.410] Successful
I0712 08:46:05.410] message:Error from server (NotFound): pods "abc" not found
I0712 08:46:05.411] has:pods "abc" not found
I0712 08:46:05.411] FAIL!
I0712 08:46:05.411] message:Error from server (NotFound): pods "abc" not found
I0712 08:46:05.412] has not:List
I0712 08:46:05.412] 99 /go/src/k8s.io/kubernetes/test/cmd/../../test/cmd/get.sh
I0712 08:46:05.524] Successful
I0712 08:46:05.524] message:I0712 08:46:05.479209   62466 loader.go:375] Config loaded from file:  /tmp/tmp.K30J9Ddv31/.kube/config
I0712 08:46:05.524] I0712 08:46:05.480731   62466 round_trippers.go:438] GET http://127.0.0.1:8080/version?timeout=32s 200 OK in 1 milliseconds
I0712 08:46:05.524] I0712 08:46:05.500055   62466 round_trippers.go:438] GET http://127.0.0.1:8080/api/v1/namespaces/default/pods 200 OK in 2 milliseconds
... skipping 684 lines ...
I0712 08:46:11.013] Successful
I0712 08:46:11.013] message:NAME    DATA   AGE
I0712 08:46:11.013] one     0      1s
I0712 08:46:11.013] three   0      1s
I0712 08:46:11.013] two     0      1s
I0712 08:46:11.014] STATUS    REASON          MESSAGE
I0712 08:46:11.014] Failure   InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
I0712 08:46:11.014] has not:watch is only supported on individual resources
I0712 08:46:12.097] Successful
I0712 08:46:12.097] message:STATUS    REASON          MESSAGE
I0712 08:46:12.097] Failure   InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
I0712 08:46:12.097] has not:watch is only supported on individual resources
I0712 08:46:12.102] +++ [0712 08:46:12] Creating namespace namespace-1562921172-29763
I0712 08:46:12.174] namespace/namespace-1562921172-29763 created
I0712 08:46:12.242] Context "test" modified.
I0712 08:46:12.333] get.sh:157: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:46:12.472] pod/valid-pod created
... skipping 104 lines ...
I0712 08:46:12.558] }
I0712 08:46:12.634] get.sh:162: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0712 08:46:12.877] <no value>Successful
I0712 08:46:12.877] message:valid-pod:
I0712 08:46:12.877] has:valid-pod:
I0712 08:46:12.959] Successful
I0712 08:46:12.960] message:error: error executing jsonpath "{.missing}": Error executing template: missing is not found. Printing more information for debugging the template:
I0712 08:46:12.960] 	template was:
I0712 08:46:12.960] 		{.missing}
I0712 08:46:12.960] 	object given to jsonpath engine was:
I0712 08:46:12.962] 		map[string]interface {}{"apiVersion":"v1", "kind":"Pod", "metadata":map[string]interface {}{"creationTimestamp":"2019-07-12T08:46:12Z", "labels":map[string]interface {}{"name":"valid-pod"}, "managedFields":[]interface {}{map[string]interface {}{"apiVersion":"v1", "fields":map[string]interface {}{"f:metadata":map[string]interface {}{"f:labels":map[string]interface {}{".":map[string]interface {}{}, "f:name":map[string]interface {}{}}}, "f:spec":map[string]interface {}{"f:containers":map[string]interface {}{"k:{\"name\":\"kubernetes-serve-hostname\"}":map[string]interface {}{".":map[string]interface {}{}, "f:image":map[string]interface {}{}, "f:imagePullPolicy":map[string]interface {}{}, "f:name":map[string]interface {}{}, "f:resources":map[string]interface {}{".":map[string]interface {}{}, "f:limits":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}, "f:requests":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}}, "f:terminationMessagePath":map[string]interface {}{}, "f:terminationMessagePolicy":map[string]interface {}{}}}, "f:dnsPolicy":map[string]interface {}{}, "f:enableServiceLinks":map[string]interface {}{}, "f:priority":map[string]interface {}{}, "f:restartPolicy":map[string]interface {}{}, "f:schedulerName":map[string]interface {}{}, "f:securityContext":map[string]interface {}{}, "f:terminationGracePeriodSeconds":map[string]interface {}{}}}, "manager":"kubectl", "operation":"Update", "time":"2019-07-12T08:46:12Z"}}, "name":"valid-pod", "namespace":"namespace-1562921172-29763", "resourceVersion":"684", "selfLink":"/api/v1/namespaces/namespace-1562921172-29763/pods/valid-pod", "uid":"83502b89-03f2-4ace-8289-dae4700139e2"}, "spec":map[string]interface {}{"containers":[]interface {}{map[string]interface {}{"image":"k8s.gcr.io/serve_hostname", "imagePullPolicy":"Always", "name":"kubernetes-serve-hostname", "resources":map[string]interface {}{"limits":map[string]interface {}{"cpu":"1", "memory":"512Mi"}, "requests":map[string]interface {}{"cpu":"1", "memory":"512Mi"}}, "terminationMessagePath":"/dev/termination-log", "terminationMessagePolicy":"File"}}, "dnsPolicy":"ClusterFirst", "enableServiceLinks":true, "priority":0, "restartPolicy":"Always", "schedulerName":"default-scheduler", "securityContext":map[string]interface {}{}, "terminationGracePeriodSeconds":30}, "status":map[string]interface {}{"phase":"Pending", "qosClass":"Guaranteed"}}
I0712 08:46:12.962] has:missing is not found
I0712 08:46:13.038] Successful
I0712 08:46:13.038] message:Error executing template: template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing". Printing more information for debugging the template:
I0712 08:46:13.038] 	template was:
I0712 08:46:13.038] 		{{.missing}}
I0712 08:46:13.038] 	raw data was:
I0712 08:46:13.040] 		{"apiVersion":"v1","kind":"Pod","metadata":{"creationTimestamp":"2019-07-12T08:46:12Z","labels":{"name":"valid-pod"},"managedFields":[{"apiVersion":"v1","fields":{"f:metadata":{"f:labels":{".":{},"f:name":{}}},"f:spec":{"f:containers":{"k:{\"name\":\"kubernetes-serve-hostname\"}":{".":{},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:resources":{".":{},"f:limits":{".":{},"f:cpu":{},"f:memory":{}},"f:requests":{".":{},"f:cpu":{},"f:memory":{}}},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:enableServiceLinks":{},"f:priority":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}},"manager":"kubectl","operation":"Update","time":"2019-07-12T08:46:12Z"}],"name":"valid-pod","namespace":"namespace-1562921172-29763","resourceVersion":"684","selfLink":"/api/v1/namespaces/namespace-1562921172-29763/pods/valid-pod","uid":"83502b89-03f2-4ace-8289-dae4700139e2"},"spec":{"containers":[{"image":"k8s.gcr.io/serve_hostname","imagePullPolicy":"Always","name":"kubernetes-serve-hostname","resources":{"limits":{"cpu":"1","memory":"512Mi"},"requests":{"cpu":"1","memory":"512Mi"}},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File"}],"dnsPolicy":"ClusterFirst","enableServiceLinks":true,"priority":0,"restartPolicy":"Always","schedulerName":"default-scheduler","securityContext":{},"terminationGracePeriodSeconds":30},"status":{"phase":"Pending","qosClass":"Guaranteed"}}
I0712 08:46:13.040] 	object given to template engine was:
I0712 08:46:13.041] 		map[apiVersion:v1 kind:Pod metadata:map[creationTimestamp:2019-07-12T08:46:12Z labels:map[name:valid-pod] managedFields:[map[apiVersion:v1 fields:map[f:metadata:map[f:labels:map[.:map[] f:name:map[]]] f:spec:map[f:containers:map[k:{"name":"kubernetes-serve-hostname"}:map[.:map[] f:image:map[] f:imagePullPolicy:map[] f:name:map[] f:resources:map[.:map[] f:limits:map[.:map[] f:cpu:map[] f:memory:map[]] f:requests:map[.:map[] f:cpu:map[] f:memory:map[]]] f:terminationMessagePath:map[] f:terminationMessagePolicy:map[]]] f:dnsPolicy:map[] f:enableServiceLinks:map[] f:priority:map[] f:restartPolicy:map[] f:schedulerName:map[] f:securityContext:map[] f:terminationGracePeriodSeconds:map[]]] manager:kubectl operation:Update time:2019-07-12T08:46:12Z]] name:valid-pod namespace:namespace-1562921172-29763 resourceVersion:684 selfLink:/api/v1/namespaces/namespace-1562921172-29763/pods/valid-pod uid:83502b89-03f2-4ace-8289-dae4700139e2] spec:map[containers:[map[image:k8s.gcr.io/serve_hostname imagePullPolicy:Always name:kubernetes-serve-hostname resources:map[limits:map[cpu:1 memory:512Mi] requests:map[cpu:1 memory:512Mi]] terminationMessagePath:/dev/termination-log terminationMessagePolicy:File]] dnsPolicy:ClusterFirst enableServiceLinks:true priority:0 restartPolicy:Always schedulerName:default-scheduler securityContext:map[] terminationGracePeriodSeconds:30] status:map[phase:Pending qosClass:Guaranteed]]
I0712 08:46:13.041] has:map has no entry for key "missing"
W0712 08:46:13.141] error: error executing template "{{.missing}}": template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing"
I0712 08:46:14.118] Successful
I0712 08:46:14.118] message:NAME        READY   STATUS    RESTARTS   AGE
I0712 08:46:14.118] valid-pod   0/1     Pending   0          1s
I0712 08:46:14.118] STATUS      REASON          MESSAGE
I0712 08:46:14.119] Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
I0712 08:46:14.119] has:STATUS
I0712 08:46:14.120] Successful
I0712 08:46:14.120] message:NAME        READY   STATUS    RESTARTS   AGE
I0712 08:46:14.120] valid-pod   0/1     Pending   0          1s
I0712 08:46:14.120] STATUS      REASON          MESSAGE
I0712 08:46:14.120] Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
I0712 08:46:14.121] has:valid-pod
I0712 08:46:15.207] Successful
I0712 08:46:15.207] message:pod/valid-pod
I0712 08:46:15.207] has not:STATUS
I0712 08:46:15.209] Successful
I0712 08:46:15.210] message:pod/valid-pod
... skipping 144 lines ...
I0712 08:46:16.305] status:
I0712 08:46:16.305]   phase: Pending
I0712 08:46:16.305]   qosClass: Guaranteed
I0712 08:46:16.305] ---
I0712 08:46:16.305] has:name: valid-pod
I0712 08:46:16.380] Successful
I0712 08:46:16.380] message:Error from server (NotFound): pods "invalid-pod" not found
I0712 08:46:16.380] has:"invalid-pod" not found
I0712 08:46:16.461] pod "valid-pod" deleted
I0712 08:46:16.556] get.sh:200: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:46:16.705] pod/redis-master created
I0712 08:46:16.709] pod/valid-pod created
I0712 08:46:16.802] Successful
... skipping 35 lines ...
I0712 08:46:17.895] +++ command: run_kubectl_exec_pod_tests
I0712 08:46:17.908] +++ [0712 08:46:17] Creating namespace namespace-1562921177-29697
I0712 08:46:17.974] namespace/namespace-1562921177-29697 created
I0712 08:46:18.041] Context "test" modified.
I0712 08:46:18.048] +++ [0712 08:46:18] Testing kubectl exec POD COMMAND
I0712 08:46:18.129] Successful
I0712 08:46:18.129] message:Error from server (NotFound): pods "abc" not found
I0712 08:46:18.129] has:pods "abc" not found
I0712 08:46:18.281] pod/test-pod created
I0712 08:46:18.382] Successful
I0712 08:46:18.383] message:Error from server (BadRequest): pod test-pod does not have a host assigned
I0712 08:46:18.383] has not:pods "test-pod" not found
I0712 08:46:18.383] Successful
I0712 08:46:18.384] message:Error from server (BadRequest): pod test-pod does not have a host assigned
I0712 08:46:18.384] has not:pod or type/name must be specified
I0712 08:46:18.464] pod "test-pod" deleted
I0712 08:46:18.484] +++ exit code: 0
I0712 08:46:18.520] Recording: run_kubectl_exec_resource_name_tests
I0712 08:46:18.520] Running command: run_kubectl_exec_resource_name_tests
I0712 08:46:18.542] 
... skipping 2 lines ...
I0712 08:46:18.549] +++ command: run_kubectl_exec_resource_name_tests
I0712 08:46:18.563] +++ [0712 08:46:18] Creating namespace namespace-1562921178-6606
I0712 08:46:18.638] namespace/namespace-1562921178-6606 created
I0712 08:46:18.708] Context "test" modified.
I0712 08:46:18.714] +++ [0712 08:46:18] Testing kubectl exec TYPE/NAME COMMAND
I0712 08:46:18.805] Successful
I0712 08:46:18.805] message:error: the server doesn't have a resource type "foo"
I0712 08:46:18.805] has:error:
I0712 08:46:18.885] Successful
I0712 08:46:18.885] message:Error from server (NotFound): deployments.apps "bar" not found
I0712 08:46:18.886] has:"bar" not found
I0712 08:46:19.045] pod/test-pod created
I0712 08:46:19.206] replicaset.apps/frontend created
W0712 08:46:19.307] I0712 08:46:19.213091   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921178-6606", Name:"frontend", UID:"349b58be-7935-4ebe-b5a7-4bfc767dee04", APIVersion:"apps/v1", ResourceVersion:"738", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-jgc2b
W0712 08:46:19.308] I0712 08:46:19.217469   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921178-6606", Name:"frontend", UID:"349b58be-7935-4ebe-b5a7-4bfc767dee04", APIVersion:"apps/v1", ResourceVersion:"738", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-52dmq
W0712 08:46:19.308] I0712 08:46:19.218124   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921178-6606", Name:"frontend", UID:"349b58be-7935-4ebe-b5a7-4bfc767dee04", APIVersion:"apps/v1", ResourceVersion:"738", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-kskb6
I0712 08:46:19.409] configmap/test-set-env-config created
I0712 08:46:19.458] Successful
I0712 08:46:19.458] message:error: cannot attach to *v1.ConfigMap: selector for *v1.ConfigMap not implemented
I0712 08:46:19.459] has:not implemented
I0712 08:46:19.545] Successful
I0712 08:46:19.545] message:Error from server (BadRequest): pod test-pod does not have a host assigned
I0712 08:46:19.545] has not:not found
I0712 08:46:19.547] Successful
I0712 08:46:19.548] message:Error from server (BadRequest): pod test-pod does not have a host assigned
I0712 08:46:19.548] has not:pod or type/name must be specified
I0712 08:46:19.648] Successful
I0712 08:46:19.648] message:Error from server (BadRequest): pod frontend-52dmq does not have a host assigned
I0712 08:46:19.648] has not:not found
I0712 08:46:19.649] Successful
I0712 08:46:19.650] message:Error from server (BadRequest): pod frontend-52dmq does not have a host assigned
I0712 08:46:19.650] has not:pod or type/name must be specified
I0712 08:46:19.724] pod "test-pod" deleted
I0712 08:46:19.805] replicaset.apps "frontend" deleted
I0712 08:46:19.884] configmap "test-set-env-config" deleted
I0712 08:46:19.901] +++ exit code: 0
I0712 08:46:19.935] Recording: run_create_secret_tests
I0712 08:46:19.936] Running command: run_create_secret_tests
I0712 08:46:19.958] 
I0712 08:46:19.960] +++ Running case: test-cmd.run_create_secret_tests 
I0712 08:46:19.962] +++ working dir: /go/src/k8s.io/kubernetes
I0712 08:46:19.965] +++ command: run_create_secret_tests
I0712 08:46:20.059] Successful
I0712 08:46:20.059] message:Error from server (NotFound): secrets "mysecret" not found
I0712 08:46:20.059] has:secrets "mysecret" not found
I0712 08:46:20.221] Successful
I0712 08:46:20.222] message:Error from server (NotFound): secrets "mysecret" not found
I0712 08:46:20.222] has:secrets "mysecret" not found
I0712 08:46:20.223] Successful
I0712 08:46:20.223] message:user-specified
I0712 08:46:20.224] has:user-specified
I0712 08:46:20.299] Successful
I0712 08:46:20.376] {"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-update-cm","uid":"8a959de3-ec42-44e9-b75f-df6cff0a4e4f","resourceVersion":"758","creationTimestamp":"2019-07-12T08:46:20Z"}}
... skipping 2 lines ...
I0712 08:46:20.555] has:uid
I0712 08:46:20.629] Successful
I0712 08:46:20.630] message:{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-update-cm","uid":"8a959de3-ec42-44e9-b75f-df6cff0a4e4f","resourceVersion":"759","creationTimestamp":"2019-07-12T08:46:20Z","managedFields":[{"manager":"kubectl","operation":"Update","apiVersion":"v1","time":"2019-07-12T08:46:20Z","fields":{"f:data":{".":{},"f:key1":{}}}}]},"data":{"key1":"config1"}}
I0712 08:46:20.630] has:config1
I0712 08:46:20.697] {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Success","details":{"name":"tester-update-cm","kind":"configmaps","uid":"8a959de3-ec42-44e9-b75f-df6cff0a4e4f"}}
I0712 08:46:20.785] Successful
I0712 08:46:20.785] message:Error from server (NotFound): configmaps "tester-update-cm" not found
I0712 08:46:20.785] has:configmaps "tester-update-cm" not found
I0712 08:46:20.796] +++ exit code: 0
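The run_create_secret_tests phase checks the round trip of a generic secret: reads fail with NotFound until the secret exists, and the literal value ("user-specified" in the log) is recoverable afterwards. A sketch under that assumption (the key name key1 is illustrative, not taken from the script):

    # expected to fail while the secret does not exist yet
    kubectl get secret mysecret
    # create a generic secret from a literal key/value pair
    kubectl create secret generic mysecret --from-literal=key1=user-specified
    # read the value back, base64-decoded, then clean up
    kubectl get secret mysecret -o go-template='{{index .data "key1"}}' | base64 --decode
    kubectl delete secret mysecret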
I0712 08:46:20.832] Recording: run_kubectl_create_kustomization_directory_tests
I0712 08:46:20.832] Running command: run_kubectl_create_kustomization_directory_tests
I0712 08:46:20.853] 
I0712 08:46:20.855] +++ Running case: test-cmd.run_kubectl_create_kustomization_directory_tests 
... skipping 158 lines ...
W0712 08:46:23.455] I0712 08:46:21.315510   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921178-6606", Name:"test-the-deployment-c44d76776", UID:"ff097173-4dd9-4b67-9995-d70ec092710e", APIVersion:"apps/v1", ResourceVersion:"768", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-the-deployment-c44d76776-s2p2b
W0712 08:46:23.455] I0712 08:46:21.317963   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921178-6606", Name:"test-the-deployment-c44d76776", UID:"ff097173-4dd9-4b67-9995-d70ec092710e", APIVersion:"apps/v1", ResourceVersion:"768", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-the-deployment-c44d76776-2f5vf
I0712 08:46:24.435] Successful
I0712 08:46:24.436] message:NAME        READY   STATUS    RESTARTS   AGE
I0712 08:46:24.436] valid-pod   0/1     Pending   0          0s
I0712 08:46:24.436] STATUS      REASON          MESSAGE
I0712 08:46:24.436] Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
I0712 08:46:24.436] has:Timeout exceeded while reading body
I0712 08:46:24.519] Successful
I0712 08:46:24.520] message:NAME        READY   STATUS    RESTARTS   AGE
I0712 08:46:24.520] valid-pod   0/1     Pending   0          1s
I0712 08:46:24.520] has:valid-pod
I0712 08:46:24.583] Successful
I0712 08:46:24.584] message:error: Invalid timeout value. Timeout must be a single integer in seconds, or an integer followed by a corresponding time unit (e.g. 1s | 2m | 3h)
I0712 08:46:24.584] has:Invalid timeout value
I0712 08:46:24.652] pod "valid-pod" deleted
I0712 08:46:24.672] +++ exit code: 0
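The checks just above exercise kubectl's --request-timeout flag: a short timeout applied to a watch aborts the stream with the "Client.Timeout exceeded while reading body" failure, a plain get inside the timeout still lists valid-pod, and a malformed value is rejected client-side with "Invalid timeout value". Roughly, with the specific flag values being assumptions:

    # watch aborted by the 1-second client timeout
    kubectl get pods --watch --request-timeout=1
    # a normal list completes well within the timeout
    kubectl get pods --request-timeout=1
    # a value that is neither a plain integer nor a duration is rejected before any request
    kubectl get pods --request-timeout=invalid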
I0712 08:46:24.706] Recording: run_crd_tests
I0712 08:46:24.707] Running command: run_crd_tests
I0712 08:46:24.732] 
... skipping 221 lines ...
I0712 08:46:28.977] foo.company.com/test patched
I0712 08:46:29.065] crd.sh:236: Successful get foos/test {{.patched}}: value1
I0712 08:46:29.144] foo.company.com/test patched
I0712 08:46:29.240] crd.sh:238: Successful get foos/test {{.patched}}: value2
I0712 08:46:29.328] foo.company.com/test patched
I0712 08:46:29.419] crd.sh:240: Successful get foos/test {{.patched}}: <no value>
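The patches above are merge patches against the custom resource foos/test: setting patched to value1/value2 and then clearing it with a null value, which the go-template check renders as <no value>. Strategic merge patching is not available for custom resources, which is what the --local error on the next line asserts. A sketch (foo.yaml is a hypothetical local manifest for the same Foo object):

    # merge patches work against a CR
    kubectl patch foos/test --type=merge -p '{"patched":"value2"}'
    # a null value in a merge patch removes the field
    kubectl patch foos/test --type=merge -p '{"patched":null}' --record
    # the default (strategic) patch type fails locally for a CR; --type merge is required
    kubectl patch --local -f foo.yaml -p '{"patched":"x"}'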
I0712 08:46:29.570] +++ [0712 08:46:29] "kubectl patch --local" returns error as expected for CustomResource: error: cannot apply strategic merge patch for company.com/v1, Kind=Foo locally, try --type merge
I0712 08:46:29.633] {
I0712 08:46:29.634]     "apiVersion": "company.com/v1",
I0712 08:46:29.634]     "kind": "Foo",
I0712 08:46:29.634]     "metadata": {
I0712 08:46:29.634]         "annotations": {
I0712 08:46:29.635]             "kubernetes.io/change-cause": "kubectl patch foos/test --server=http://127.0.0.1:8080 --match-server-version=true --patch={\"patched\":null} --type=merge --record=true"
... skipping 365 lines ...
I0712 08:47:07.019] +++ working dir: /go/src/k8s.io/kubernetes
I0712 08:47:07.021] +++ command: run_cmd_with_img_tests
I0712 08:47:07.035] +++ [0712 08:47:07] Creating namespace namespace-1562921227-10218
I0712 08:47:07.107] namespace/namespace-1562921227-10218 created
I0712 08:47:07.186] Context "test" modified.
I0712 08:47:07.193] +++ [0712 08:47:07] Testing cmd with image
W0712 08:47:07.294] Error from server (NotFound): namespaces "non-native-resources" not found
W0712 08:47:07.294] kubectl run --generator=deployment/apps.v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
W0712 08:47:07.298] I0712 08:47:07.297500   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921227-10218", Name:"test1", UID:"17f77735-8899-4d35-88b0-2a62df164c3f", APIVersion:"apps/v1", ResourceVersion:"918", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test1-6d8bbd4554 to 1
W0712 08:47:07.304] I0712 08:47:07.303727   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921227-10218", Name:"test1-6d8bbd4554", UID:"62546b32-5531-4164-953f-5d9ceb1cd867", APIVersion:"apps/v1", ResourceVersion:"919", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test1-6d8bbd4554-xfd25
I0712 08:47:07.405] Successful
I0712 08:47:07.406] message:deployment.apps/test1 created
I0712 08:47:07.406] has:deployment.apps/test1 created
I0712 08:47:07.406] deployment.apps "test1" deleted
I0712 08:47:07.464] Successful
I0712 08:47:07.465] message:error: Invalid image name "InvalidImageName": invalid reference format
I0712 08:47:07.465] has:error: Invalid image name "InvalidImageName": invalid reference format
I0712 08:47:07.476] +++ exit code: 0
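run_cmd_with_img_tests covers image validation on kubectl run using the (deprecated) deployment/apps.v1 generator noted in the warnings above: a well-formed image reference produces deployment.apps/test1, while a malformed one is rejected before anything is created. A sketch, with nginx standing in for whatever valid image the script actually uses:

    # valid reference: a deployment named test1 is created, then removed
    kubectl run test1 --image=nginx --generator=deployment/apps.v1
    kubectl delete deployment test1
    # malformed reference: fails with "invalid reference format", nothing is created
    kubectl run test2 --image=InvalidImageName --generator=deployment/apps.v1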
I0712 08:47:07.511] +++ [0712 08:47:07] Testing recursive resources
I0712 08:47:07.516] +++ [0712 08:47:07] Creating namespace namespace-1562921227-23176
I0712 08:47:07.590] namespace/namespace-1562921227-23176 created
I0712 08:47:07.662] Context "test" modified.
I0712 08:47:07.751] generic-resources.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:47:08.044] generic-resources.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0712 08:47:08.047] Successful
I0712 08:47:08.047] message:pod/busybox0 created
I0712 08:47:08.047] pod/busybox1 created
I0712 08:47:08.047] error: error validating "hack/testdata/recursive/pod/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0712 08:47:08.047] has:error validating data: kind not set
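Everything in this "recursive resources" block drives kubectl with -f on a directory plus --recursive (-R) against hack/testdata/recursive/..., where one manifest per directory deliberately misspells kind as ind. Each command therefore succeeds for busybox0/busybox1 (and later the nginx deployments) while also reporting a validation or decode error for the broken file, which is what the has:/has not: assertions check. In outline:

    # the two good pods are created; busybox-broken.yaml only yields the error above
    kubectl create -f hack/testdata/recursive/pod --recursive
    # the same pattern holds for replace, annotate, label, patch, apply ...
    kubectl label -f hack/testdata/recursive/pod --recursive mylabel=myvalue
    # ... and for forced deletion (the "Immediate deletion" warning comes from --force)
    kubectl delete -f hack/testdata/recursive/pod --recursive --force --grace-period=0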
I0712 08:47:08.145] generic-resources.sh:211: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0712 08:47:08.314] generic-resources.sh:220: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: busybox:busybox:
I0712 08:47:08.316] Successful
I0712 08:47:08.316] message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0712 08:47:08.316] has:Object 'Kind' is missing
I0712 08:47:08.404] generic-resources.sh:227: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0712 08:47:08.648] generic-resources.sh:231: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
I0712 08:47:08.651] Successful
I0712 08:47:08.651] message:pod/busybox0 replaced
I0712 08:47:08.651] pod/busybox1 replaced
I0712 08:47:08.651] error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0712 08:47:08.651] has:error validating data: kind not set
I0712 08:47:08.743] generic-resources.sh:236: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0712 08:47:08.837] Successful
I0712 08:47:08.837] message:Name:         busybox0
I0712 08:47:08.837] Namespace:    namespace-1562921227-23176
I0712 08:47:08.838] Priority:     0
I0712 08:47:08.838] Node:         <none>
... skipping 159 lines ...
I0712 08:47:08.853] has:Object 'Kind' is missing
I0712 08:47:08.936] generic-resources.sh:246: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0712 08:47:09.124] generic-resources.sh:250: Successful get pods {{range.items}}{{.metadata.annotations.annotatekey}}:{{end}}: annotatevalue:annotatevalue:
I0712 08:47:09.126] Successful
I0712 08:47:09.127] message:pod/busybox0 annotated
I0712 08:47:09.127] pod/busybox1 annotated
I0712 08:47:09.128] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0712 08:47:09.128] has:Object 'Kind' is missing
I0712 08:47:09.222] generic-resources.sh:255: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0712 08:47:09.502] generic-resources.sh:259: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
I0712 08:47:09.504] Successful
I0712 08:47:09.505] message:Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
I0712 08:47:09.505] pod/busybox0 configured
I0712 08:47:09.505] Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
I0712 08:47:09.505] pod/busybox1 configured
I0712 08:47:09.505] error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0712 08:47:09.506] has:error validating data: kind not set
I0712 08:47:09.598] generic-resources.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:47:09.748] deployment.apps/nginx created
W0712 08:47:09.849] W0712 08:47:07.667724   48560 cacher.go:154] Terminating all watchers from cacher *unstructured.Unstructured
W0712 08:47:09.850] E0712 08:47:07.669273   51895 reflector.go:283] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:09.850] W0712 08:47:07.759704   48560 cacher.go:154] Terminating all watchers from cacher *unstructured.Unstructured
W0712 08:47:09.850] E0712 08:47:07.761101   51895 reflector.go:283] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:09.850] W0712 08:47:07.850353   48560 cacher.go:154] Terminating all watchers from cacher *unstructured.Unstructured
W0712 08:47:09.850] E0712 08:47:07.851844   51895 reflector.go:283] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:09.851] W0712 08:47:07.944808   48560 cacher.go:154] Terminating all watchers from cacher *unstructured.Unstructured
W0712 08:47:09.851] E0712 08:47:07.945863   51895 reflector.go:283] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:09.851] E0712 08:47:08.670801   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:09.851] E0712 08:47:08.762850   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:09.851] E0712 08:47:08.853072   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:09.852] E0712 08:47:08.947190   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:09.852] E0712 08:47:09.672331   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:09.852] I0712 08:47:09.753069   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921227-23176", Name:"nginx", UID:"2df2c66a-f55a-4466-997d-d63bb4da65c7", APIVersion:"apps/v1", ResourceVersion:"943", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-7bb858d887 to 3
W0712 08:47:09.853] I0712 08:47:09.759528   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921227-23176", Name:"nginx-7bb858d887", UID:"9870732e-adeb-453d-a08d-b84688dc0682", APIVersion:"apps/v1", ResourceVersion:"944", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-7bb858d887-dhttp
W0712 08:47:09.853] E0712 08:47:09.764869   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:09.853] I0712 08:47:09.765399   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921227-23176", Name:"nginx-7bb858d887", UID:"9870732e-adeb-453d-a08d-b84688dc0682", APIVersion:"apps/v1", ResourceVersion:"944", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-7bb858d887-hvwsb
W0712 08:47:09.853] I0712 08:47:09.772182   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921227-23176", Name:"nginx-7bb858d887", UID:"9870732e-adeb-453d-a08d-b84688dc0682", APIVersion:"apps/v1", ResourceVersion:"944", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-7bb858d887-qv9xk
W0712 08:47:09.855] E0712 08:47:09.854449   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:09.949] E0712 08:47:09.948738   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:10.022] kubectl convert is DEPRECATED and will be removed in a future version.
W0712 08:47:10.023] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
I0712 08:47:10.123] generic-resources.sh:269: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx:
I0712 08:47:10.124] generic-resources.sh:270: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0712 08:47:10.124] generic-resources.sh:274: Successful get deployment nginx {{ .apiVersion }}: apps/v1
I0712 08:47:10.126] Successful
... skipping 42 lines ...
I0712 08:47:10.205] deployment.apps "nginx" deleted
I0712 08:47:10.300] generic-resources.sh:281: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0712 08:47:10.470] generic-resources.sh:285: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0712 08:47:10.472] Successful
I0712 08:47:10.473] message:kubectl convert is DEPRECATED and will be removed in a future version.
I0712 08:47:10.473] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
I0712 08:47:10.474] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0712 08:47:10.474] has:Object 'Kind' is missing
I0712 08:47:10.564] generic-resources.sh:290: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0712 08:47:10.654] Successful
I0712 08:47:10.655] message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0712 08:47:10.655] has:busybox0:busybox1:
I0712 08:47:10.656] Successful
I0712 08:47:10.657] message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0712 08:47:10.657] has:Object 'Kind' is missing
I0712 08:47:10.742] generic-resources.sh:299: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0712 08:47:10.836] pod/busybox0 labeled
I0712 08:47:10.836] pod/busybox1 labeled
I0712 08:47:10.836] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0712 08:47:10.939] generic-resources.sh:304: Successful get pods {{range.items}}{{.metadata.labels.mylabel}}:{{end}}: myvalue:myvalue:
I0712 08:47:10.941] Successful
I0712 08:47:10.941] message:pod/busybox0 labeled
I0712 08:47:10.941] pod/busybox1 labeled
I0712 08:47:10.942] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0712 08:47:10.942] has:Object 'Kind' is missing
I0712 08:47:11.034] generic-resources.sh:309: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0712 08:47:11.123] pod/busybox0 patched
I0712 08:47:11.123] pod/busybox1 patched
I0712 08:47:11.124] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0712 08:47:11.213] generic-resources.sh:314: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: prom/busybox:prom/busybox:
I0712 08:47:11.216] Successful
I0712 08:47:11.216] message:pod/busybox0 patched
I0712 08:47:11.216] pod/busybox1 patched
I0712 08:47:11.217] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0712 08:47:11.217] has:Object 'Kind' is missing
I0712 08:47:11.316] generic-resources.sh:319: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0712 08:47:11.505] generic-resources.sh:323: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:47:11.507] Successful
I0712 08:47:11.508] message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
I0712 08:47:11.508] pod "busybox0" force deleted
I0712 08:47:11.508] pod "busybox1" force deleted
I0712 08:47:11.508] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0712 08:47:11.508] has:Object 'Kind' is missing
I0712 08:47:11.598] generic-resources.sh:328: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:47:11.753] replicationcontroller/busybox0 created
I0712 08:47:11.758] replicationcontroller/busybox1 created
I0712 08:47:11.858] generic-resources.sh:332: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0712 08:47:11.949] generic-resources.sh:337: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0712 08:47:12.035] generic-resources.sh:338: Successful get rc busybox0 {{.spec.replicas}}: 1
I0712 08:47:12.123] generic-resources.sh:339: Successful get rc busybox1 {{.spec.replicas}}: 1
I0712 08:47:12.312] generic-resources.sh:344: Successful get hpa busybox0 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
I0712 08:47:12.401] generic-resources.sh:345: Successful get hpa busybox1 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
I0712 08:47:12.404] Successful
I0712 08:47:12.404] message:horizontalpodautoscaler.autoscaling/busybox0 autoscaled
I0712 08:47:12.404] horizontalpodautoscaler.autoscaling/busybox1 autoscaled
I0712 08:47:12.405] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0712 08:47:12.405] has:Object 'Kind' is missing
I0712 08:47:12.485] horizontalpodautoscaler.autoscaling "busybox0" deleted
I0712 08:47:12.577] horizontalpodautoscaler.autoscaling "busybox1" deleted
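The autoscale step applies kubectl autoscale recursively over the rc test directory: each valid replication controller gets an HPA with min 1, max 2 and an 80% CPU target (the generic-resources.sh:344/345 checks above), and the broken manifest again only produces the 'Kind' is missing error. Approximately:

    # one HPA per good RC; busybox-broken.yaml is reported but does not stop the command
    kubectl autoscale -f hack/testdata/recursive/rc --recursive --min=1 --max=2 --cpu-percent=80
    kubectl delete hpa busybox0 busybox1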
I0712 08:47:12.677] generic-resources.sh:353: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0712 08:47:12.766] generic-resources.sh:354: Successful get rc busybox0 {{.spec.replicas}}: 1
I0712 08:47:12.857] generic-resources.sh:355: Successful get rc busybox1 {{.spec.replicas}}: 1
I0712 08:47:13.041] generic-resources.sh:359: Successful get service busybox0 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
I0712 08:47:13.130] generic-resources.sh:360: Successful get service busybox1 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
I0712 08:47:13.132] Successful
I0712 08:47:13.133] message:service/busybox0 exposed
I0712 08:47:13.133] service/busybox1 exposed
I0712 08:47:13.133] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0712 08:47:13.133] has:Object 'Kind' is missing
I0712 08:47:13.229] generic-resources.sh:366: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0712 08:47:13.312] generic-resources.sh:367: Successful get rc busybox0 {{.spec.replicas}}: 1
I0712 08:47:13.412] generic-resources.sh:368: Successful get rc busybox1 {{.spec.replicas}}: 1
I0712 08:47:13.617] generic-resources.sh:372: Successful get rc busybox0 {{.spec.replicas}}: 2
I0712 08:47:13.713] generic-resources.sh:373: Successful get rc busybox1 {{.spec.replicas}}: 2
I0712 08:47:13.715] Successful
I0712 08:47:13.716] message:replicationcontroller/busybox0 scaled
I0712 08:47:13.716] replicationcontroller/busybox1 scaled
I0712 08:47:13.716] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0712 08:47:13.716] has:Object 'Kind' is missing
I0712 08:47:13.804] generic-resources.sh:378: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0712 08:47:13.983] generic-resources.sh:382: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:47:13.985] Successful
I0712 08:47:13.986] message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
I0712 08:47:13.986] replicationcontroller "busybox0" force deleted
I0712 08:47:13.986] replicationcontroller "busybox1" force deleted
I0712 08:47:13.986] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0712 08:47:13.986] has:Object 'Kind' is missing
I0712 08:47:14.075] generic-resources.sh:387: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:47:14.223] deployment.apps/nginx1-deployment created
I0712 08:47:14.229] deployment.apps/nginx0-deployment created
W0712 08:47:14.330] E0712 08:47:10.674394   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:14.330] E0712 08:47:10.766787   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:14.330] E0712 08:47:10.855803   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:14.331] E0712 08:47:10.950185   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:14.331] E0712 08:47:11.676139   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:14.331] error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W0712 08:47:14.331] I0712 08:47:11.758596   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921227-23176", Name:"busybox0", UID:"3a6aa217-94c0-4d2a-ac2a-45c0c624923b", APIVersion:"v1", ResourceVersion:"974", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-bl4k7
W0712 08:47:14.332] I0712 08:47:11.762229   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921227-23176", Name:"busybox1", UID:"049c7d46-3000-4b2a-838f-7e1538c1947f", APIVersion:"v1", ResourceVersion:"976", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-8wrw9
W0712 08:47:14.332] E0712 08:47:11.767859   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:14.332] E0712 08:47:11.857064   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:14.332] E0712 08:47:11.951767   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:14.333] E0712 08:47:12.677652   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:14.333] E0712 08:47:12.769278   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:14.333] E0712 08:47:12.858501   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:14.333] E0712 08:47:12.952917   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:14.334] I0712 08:47:13.519704   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921227-23176", Name:"busybox0", UID:"3a6aa217-94c0-4d2a-ac2a-45c0c624923b", APIVersion:"v1", ResourceVersion:"996", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-twl6v
W0712 08:47:14.334] I0712 08:47:13.529155   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921227-23176", Name:"busybox1", UID:"049c7d46-3000-4b2a-838f-7e1538c1947f", APIVersion:"v1", ResourceVersion:"999", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-pv9sx
W0712 08:47:14.334] E0712 08:47:13.679168   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:14.334] E0712 08:47:13.770605   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:14.335] E0712 08:47:13.859938   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:14.335] E0712 08:47:13.954742   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:14.335] error: error validating "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W0712 08:47:14.336] I0712 08:47:14.230755   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921227-23176", Name:"nginx1-deployment", UID:"f4586886-7304-4ee3-9457-b69962a23165", APIVersion:"apps/v1", ResourceVersion:"1016", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx1-deployment-786b9bc5df to 2
W0712 08:47:14.336] I0712 08:47:14.237464   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921227-23176", Name:"nginx1-deployment-786b9bc5df", UID:"0d3607cd-890c-45f0-87c6-8ac9a8bfc7cb", APIVersion:"apps/v1", ResourceVersion:"1017", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-786b9bc5df-m8wkz
W0712 08:47:14.337] I0712 08:47:14.238918   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921227-23176", Name:"nginx0-deployment", UID:"e03ec77e-df7a-4dc0-a6cc-c41602b2f849", APIVersion:"apps/v1", ResourceVersion:"1018", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx0-deployment-8544dd566 to 2
W0712 08:47:14.337] I0712 08:47:14.243678   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921227-23176", Name:"nginx1-deployment-786b9bc5df", UID:"0d3607cd-890c-45f0-87c6-8ac9a8bfc7cb", APIVersion:"apps/v1", ResourceVersion:"1017", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-786b9bc5df-zbzbg
W0712 08:47:14.337] I0712 08:47:14.245569   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921227-23176", Name:"nginx0-deployment-8544dd566", UID:"bec11006-adbf-48cc-8206-048e79b34ae7", APIVersion:"apps/v1", ResourceVersion:"1021", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-8544dd566-kdqxf
W0712 08:47:14.338] I0712 08:47:14.249460   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921227-23176", Name:"nginx0-deployment-8544dd566", UID:"bec11006-adbf-48cc-8206-048e79b34ae7", APIVersion:"apps/v1", ResourceVersion:"1021", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-8544dd566-lk86n
I0712 08:47:14.438] generic-resources.sh:391: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx0-deployment:nginx1-deployment:
I0712 08:47:14.473] generic-resources.sh:392: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
I0712 08:47:14.673] generic-resources.sh:396: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
I0712 08:47:14.676] Successful
I0712 08:47:14.676] message:deployment.apps/nginx1-deployment skipped rollback (current template already matches revision 1)
I0712 08:47:14.677] deployment.apps/nginx0-deployment skipped rollback (current template already matches revision 1)
I0712 08:47:14.677] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0712 08:47:14.677] has:Object 'Kind' is missing
I0712 08:47:14.769] deployment.apps/nginx1-deployment paused
I0712 08:47:14.776] deployment.apps/nginx0-deployment paused
I0712 08:47:14.881] generic-resources.sh:404: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: true:true:
I0712 08:47:14.883] Successful
I0712 08:47:14.884] message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0712 08:47:14.884] has:Object 'Kind' is missing
W0712 08:47:14.985] E0712 08:47:14.681066   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:14.985] E0712 08:47:14.773841   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:14.985] E0712 08:47:14.861649   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:14.985] E0712 08:47:14.956388   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:47:15.086] deployment.apps/nginx1-deployment resumed
I0712 08:47:15.086] deployment.apps/nginx0-deployment resumed
I0712 08:47:15.111] generic-resources.sh:410: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: <no value>:<no value>:
I0712 08:47:15.113] Successful
I0712 08:47:15.114] message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0712 08:47:15.114] has:Object 'Kind' is missing
... skipping 3 lines ...
I0712 08:47:15.226] 1         <none>
I0712 08:47:15.226] 
I0712 08:47:15.226] deployment.apps/nginx0-deployment 
I0712 08:47:15.226] REVISION  CHANGE-CAUSE
I0712 08:47:15.226] 1         <none>
I0712 08:47:15.226] 
I0712 08:47:15.227] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0712 08:47:15.227] has:nginx0-deployment
I0712 08:47:15.227] Successful
I0712 08:47:15.227] message:deployment.apps/nginx1-deployment 
I0712 08:47:15.227] REVISION  CHANGE-CAUSE
I0712 08:47:15.228] 1         <none>
I0712 08:47:15.228] 
I0712 08:47:15.228] deployment.apps/nginx0-deployment 
I0712 08:47:15.228] REVISION  CHANGE-CAUSE
I0712 08:47:15.228] 1         <none>
I0712 08:47:15.228] 
I0712 08:47:15.228] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0712 08:47:15.228] has:nginx1-deployment
I0712 08:47:15.230] Successful
I0712 08:47:15.230] message:deployment.apps/nginx1-deployment 
I0712 08:47:15.230] REVISION  CHANGE-CAUSE
I0712 08:47:15.230] 1         <none>
I0712 08:47:15.230] 
I0712 08:47:15.230] deployment.apps/nginx0-deployment 
I0712 08:47:15.230] REVISION  CHANGE-CAUSE
I0712 08:47:15.230] 1         <none>
I0712 08:47:15.230] 
I0712 08:47:15.231] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0712 08:47:15.231] has:Object 'Kind' is missing
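The deployment portion of the recursive tests runs rollout subcommands (undo, pause, resume, history) over hack/testdata/recursive/deployment with the same broken-manifest pattern: pausing flips .spec.paused to true on both valid deployments and resuming clears it again, as the generic-resources.sh:404/410 checks show. Assuming the rollout family accepts -f with --recursive here, as the other verbs do (the script may spell this differently):

    kubectl rollout pause   -f hack/testdata/recursive/deployment --recursive
    kubectl rollout resume  -f hack/testdata/recursive/deployment --recursive
    kubectl rollout history -f hack/testdata/recursive/deployment --recursive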
I0712 08:47:15.314] deployment.apps "nginx1-deployment" force deleted
I0712 08:47:15.320] deployment.apps "nginx0-deployment" force deleted
W0712 08:47:15.421] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0712 08:47:15.422] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
W0712 08:47:15.683] E0712 08:47:15.682929   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:15.776] E0712 08:47:15.775477   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:15.864] E0712 08:47:15.863157   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:15.958] E0712 08:47:15.957946   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:47:16.414] generic-resources.sh:426: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:47:16.562] replicationcontroller/busybox0 created
I0712 08:47:16.565] replicationcontroller/busybox1 created
I0712 08:47:16.662] generic-resources.sh:430: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0712 08:47:16.748] Successful
I0712 08:47:16.748] message:no rollbacker has been implemented for "ReplicationController"
... skipping 4 lines ...
I0712 08:47:16.750] message:no rollbacker has been implemented for "ReplicationController"
I0712 08:47:16.750] no rollbacker has been implemented for "ReplicationController"
I0712 08:47:16.751] unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0712 08:47:16.751] has:Object 'Kind' is missing
I0712 08:47:16.842] Successful
I0712 08:47:16.842] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0712 08:47:16.842] error: replicationcontrollers "busybox0" pausing is not supported
I0712 08:47:16.843] error: replicationcontrollers "busybox1" pausing is not supported
I0712 08:47:16.843] has:Object 'Kind' is missing
I0712 08:47:16.844] Successful
I0712 08:47:16.845] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0712 08:47:16.845] error: replicationcontrollers "busybox0" pausing is not supported
I0712 08:47:16.845] error: replicationcontrollers "busybox1" pausing is not supported
I0712 08:47:16.845] has:replicationcontrollers "busybox0" pausing is not supported
I0712 08:47:16.847] Successful
I0712 08:47:16.848] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0712 08:47:16.848] error: replicationcontrollers "busybox0" pausing is not supported
I0712 08:47:16.848] error: replicationcontrollers "busybox1" pausing is not supported
I0712 08:47:16.848] has:replicationcontrollers "busybox1" pausing is not supported
I0712 08:47:16.938] Successful
I0712 08:47:16.938] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0712 08:47:16.939] error: replicationcontrollers "busybox0" resuming is not supported
I0712 08:47:16.939] error: replicationcontrollers "busybox1" resuming is not supported
I0712 08:47:16.939] has:Object 'Kind' is missing
I0712 08:47:16.941] Successful
I0712 08:47:16.941] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0712 08:47:16.942] error: replicationcontrollers "busybox0" resuming is not supported
I0712 08:47:16.942] error: replicationcontrollers "busybox1" resuming is not supported
I0712 08:47:16.942] has:replicationcontrollers "busybox0" resuming is not supported
I0712 08:47:16.943] Successful
I0712 08:47:16.944] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0712 08:47:16.944] error: replicationcontrollers "busybox0" resuming is not supported
I0712 08:47:16.944] error: replicationcontrollers "busybox1" resuming is not supported
I0712 08:47:16.944] has:replicationcontrollers "busybox0" resuming is not supported
I0712 08:47:17.023] replicationcontroller "busybox0" force deleted
I0712 08:47:17.029] replicationcontroller "busybox1" force deleted
W0712 08:47:17.130] error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W0712 08:47:17.130] I0712 08:47:16.566380   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921227-23176", Name:"busybox0", UID:"b01c8401-bf53-4fb0-a612-a6b208dee1b3", APIVersion:"v1", ResourceVersion:"1065", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-tbrgm
W0712 08:47:17.131] I0712 08:47:16.570417   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921227-23176", Name:"busybox1", UID:"380ab9a5-73d3-4f3d-bc9c-d55e1b4c5ae6", APIVersion:"v1", ResourceVersion:"1066", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-f5bfq
W0712 08:47:17.131] E0712 08:47:16.684288   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:17.131] E0712 08:47:16.776816   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:17.131] E0712 08:47:16.864668   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:17.132] E0712 08:47:16.959708   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:17.132] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0712 08:47:17.132] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
W0712 08:47:17.686] E0712 08:47:17.685961   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:17.779] E0712 08:47:17.778699   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:17.867] E0712 08:47:17.866297   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:17.962] E0712 08:47:17.962003   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:47:18.063] Recording: run_namespace_tests
I0712 08:47:18.063] Running command: run_namespace_tests
I0712 08:47:18.063] 
I0712 08:47:18.063] +++ Running case: test-cmd.run_namespace_tests 
I0712 08:47:18.064] +++ working dir: /go/src/k8s.io/kubernetes
I0712 08:47:18.067] +++ command: run_namespace_tests
I0712 08:47:18.076] +++ [0712 08:47:18] Testing kubectl(v1:namespaces)
I0712 08:47:18.147] namespace/my-namespace created
I0712 08:47:18.245] core.sh:1308: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
I0712 08:47:18.328] namespace "my-namespace" deleted
W0712 08:47:18.688] E0712 08:47:18.687739   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:18.781] E0712 08:47:18.780648   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:18.868] E0712 08:47:18.868000   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:18.964] E0712 08:47:18.963698   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:19.691] E0712 08:47:19.690579   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:19.783] E0712 08:47:19.782145   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:19.871] E0712 08:47:19.870208   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:19.966] E0712 08:47:19.965787   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:20.693] E0712 08:47:20.692369   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:20.784] E0712 08:47:20.784060   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:20.872] E0712 08:47:20.871983   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:20.968] E0712 08:47:20.967332   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:21.694] E0712 08:47:21.693870   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:21.786] E0712 08:47:21.785471   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:21.874] E0712 08:47:21.873678   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:21.970] E0712 08:47:21.969181   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:22.696] E0712 08:47:22.696012   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:22.787] E0712 08:47:22.787083   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:22.875] E0712 08:47:22.874986   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:22.971] E0712 08:47:22.970870   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:47:23.433] namespace/my-namespace condition met
I0712 08:47:23.523] Successful
I0712 08:47:23.523] message:Error from server (NotFound): namespaces "my-namespace" not found
I0712 08:47:23.523] has: not found
I0712 08:47:23.591] namespace/my-namespace created
I0712 08:47:23.686] core.sh:1317: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
I0712 08:47:23.891] Successful
I0712 08:47:23.891] message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
I0712 08:47:23.891] namespace "kube-node-lease" deleted
... skipping 29 lines ...
I0712 08:47:23.894] namespace "namespace-1562921181-8212" deleted
I0712 08:47:23.894] namespace "namespace-1562921182-24727" deleted
I0712 08:47:23.894] namespace "namespace-1562921184-2423" deleted
I0712 08:47:23.894] namespace "namespace-1562921185-4947" deleted
I0712 08:47:23.894] namespace "namespace-1562921227-10218" deleted
I0712 08:47:23.894] namespace "namespace-1562921227-23176" deleted
I0712 08:47:23.895] Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
I0712 08:47:23.895] Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
I0712 08:47:23.895] Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
I0712 08:47:23.895] has:warning: deleting cluster-scoped resources
I0712 08:47:23.895] Successful
I0712 08:47:23.895] message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
I0712 08:47:23.895] namespace "kube-node-lease" deleted
I0712 08:47:23.895] namespace "my-namespace" deleted
I0712 08:47:23.895] namespace "namespace-1562921093-15765" deleted
... skipping 27 lines ...
I0712 08:47:23.898] namespace "namespace-1562921181-8212" deleted
I0712 08:47:23.898] namespace "namespace-1562921182-24727" deleted
I0712 08:47:23.898] namespace "namespace-1562921184-2423" deleted
I0712 08:47:23.898] namespace "namespace-1562921185-4947" deleted
I0712 08:47:23.898] namespace "namespace-1562921227-10218" deleted
I0712 08:47:23.898] namespace "namespace-1562921227-23176" deleted
I0712 08:47:23.898] Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
I0712 08:47:23.898] Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
I0712 08:47:23.898] Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
I0712 08:47:23.899] has:namespace "my-namespace" deleted
I0712 08:47:23.993] core.sh:1329: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"other\" }}found{{end}}{{end}}:: :
I0712 08:47:24.061] namespace/other created
I0712 08:47:24.161] core.sh:1333: Successful get namespaces/other {{.metadata.name}}: other
I0712 08:47:24.249] core.sh:1337: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:47:24.410] pod/valid-pod created
I0712 08:47:24.508] core.sh:1341: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0712 08:47:24.599] core.sh:1343: Successful get pods -n other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0712 08:47:24.678] Successful
I0712 08:47:24.679] message:error: a resource cannot be retrieved by name across all namespaces
I0712 08:47:24.679] has:a resource cannot be retrieved by name across all namespaces
I0712 08:47:24.765] core.sh:1350: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0712 08:47:24.845] pod "valid-pod" force deleted
I0712 08:47:24.944] core.sh:1354: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:47:25.028] namespace "other" deleted
W0712 08:47:25.130] E0712 08:47:23.697400   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:25.130] E0712 08:47:23.788584   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:25.131] E0712 08:47:23.876011   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:25.131] E0712 08:47:23.972268   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:25.132] E0712 08:47:24.699063   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:25.133] E0712 08:47:24.790251   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:25.133] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0712 08:47:25.134] E0712 08:47:24.877639   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:25.134] E0712 08:47:24.973800   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:25.616] I0712 08:47:25.615757   51895 controller_utils.go:1029] Waiting for caches to sync for garbage collector controller
W0712 08:47:25.701] E0712 08:47:25.700729   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:25.710] I0712 08:47:25.710233   51895 controller_utils.go:1029] Waiting for caches to sync for resource quota controller
W0712 08:47:25.716] I0712 08:47:25.716213   51895 controller_utils.go:1036] Caches are synced for garbage collector controller
W0712 08:47:25.792] E0712 08:47:25.791878   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:25.811] I0712 08:47:25.810695   51895 controller_utils.go:1036] Caches are synced for resource quota controller
W0712 08:47:25.880] E0712 08:47:25.879658   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:25.976] E0712 08:47:25.975392   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:26.703] E0712 08:47:26.703018   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:26.794] E0712 08:47:26.793290   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:26.881] E0712 08:47:26.880800   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:26.977] E0712 08:47:26.977141   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:27.209] I0712 08:47:27.209037   51895 horizontal.go:341] Horizontal Pod Autoscaler busybox0 has been deleted in namespace-1562921227-23176
W0712 08:47:27.214] I0712 08:47:27.213736   51895 horizontal.go:341] Horizontal Pod Autoscaler busybox1 has been deleted in namespace-1562921227-23176
W0712 08:47:27.705] E0712 08:47:27.704627   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:27.795] E0712 08:47:27.794767   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:27.882] E0712 08:47:27.882266   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:27.979] E0712 08:47:27.978739   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:28.707] E0712 08:47:28.706276   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:28.796] E0712 08:47:28.796195   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:28.886] E0712 08:47:28.885529   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:28.981] E0712 08:47:28.980924   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:29.709] E0712 08:47:29.708219   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:29.799] E0712 08:47:29.798392   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:29.888] E0712 08:47:29.887310   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:29.984] E0712 08:47:29.983660   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:47:30.137] +++ exit code: 0
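For context, the run_namespace_tests block above exercises roughly the following kubectl flow (a sketch reconstructed from the logged output; the exact assertions live in core.sh around lines 1308-1354 and may differ). It also runs a bulk namespace deletion, visible as the "deleting cluster-scoped resources" warning, for which the built-in default, kube-public and kube-system namespaces are refused with Forbidden.

    # create a namespace, verify it by name, delete it, and confirm it is gone
    kubectl create namespace my-namespace
    kubectl get namespaces/my-namespace -o go-template='{{.metadata.name}}'        # expect: my-namespace
    kubectl delete namespace my-namespace
    kubectl get namespaces my-namespace                                            # expect: Error from server (NotFound)
    # scope pod queries to a specific namespace with --namespace / -n
    kubectl create namespace other
    kubectl get pods --namespace=other -o go-template='{{range.items}}{{.metadata.name}}:{{end}}'
    kubectl delete pod valid-pod --namespace=other --force --grace-period=0        # "force deleted", with the immediate-deletion warning
    kubectl delete namespace other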
I0712 08:47:30.171] Recording: run_secrets_test
I0712 08:47:30.172] Running command: run_secrets_test
I0712 08:47:30.192] 
I0712 08:47:30.194] +++ Running case: test-cmd.run_secrets_test 
I0712 08:47:30.196] +++ working dir: /go/src/k8s.io/kubernetes
... skipping 58 lines ...
I0712 08:47:32.033] secret "test-secret" deleted
I0712 08:47:32.111] secret/test-secret created
I0712 08:47:32.204] core.sh:773: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
I0712 08:47:32.290] core.sh:774: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/tls
I0712 08:47:32.370] secret "test-secret" deleted
W0712 08:47:32.471] I0712 08:47:30.421311   68857 loader.go:375] Config loaded from file:  /tmp/tmp.K30J9Ddv31/.kube/config
W0712 08:47:32.472] E0712 08:47:30.710928   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:32.472] E0712 08:47:30.799922   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:32.473] E0712 08:47:30.888811   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:32.473] E0712 08:47:30.984928   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:32.473] E0712 08:47:31.712014   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:32.474] E0712 08:47:31.803040   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:32.474] E0712 08:47:31.889830   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:32.474] E0712 08:47:31.986073   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:47:32.575] secret/secret-string-data created
I0712 08:47:32.622] core.sh:796: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
I0712 08:47:32.706] core.sh:797: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
I0712 08:47:32.798] core.sh:798: Successful get secret/secret-string-data --namespace=test-secrets  {{.stringData}}: <no value>
I0712 08:47:32.874] secret "secret-string-data" deleted
I0712 08:47:32.975] core.sh:807: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:47:33.134] secret "test-secret" deleted
I0712 08:47:33.219] namespace "test-secrets" deleted
W0712 08:47:33.320] E0712 08:47:32.713904   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:33.321] E0712 08:47:32.804530   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:33.321] E0712 08:47:32.891176   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:33.321] E0712 08:47:32.987745   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:33.716] E0712 08:47:33.715622   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:33.806] E0712 08:47:33.806073   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:33.893] E0712 08:47:33.892556   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:33.990] E0712 08:47:33.989433   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:34.718] E0712 08:47:34.717381   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:34.808] E0712 08:47:34.807728   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:34.895] E0712 08:47:34.894273   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:34.991] E0712 08:47:34.991049   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:35.720] E0712 08:47:35.719355   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:35.810] E0712 08:47:35.809356   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:35.896] E0712 08:47:35.895951   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:35.993] E0712 08:47:35.992756   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:36.721] E0712 08:47:36.720762   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:36.811] E0712 08:47:36.811019   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:36.898] E0712 08:47:36.897390   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:36.994] E0712 08:47:36.994204   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:37.723] E0712 08:47:37.722467   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:37.814] E0712 08:47:37.813287   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:37.899] E0712 08:47:37.899188   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:37.997] E0712 08:47:37.996284   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:47:38.338] +++ exit code: 0
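The run_secrets_test block covers TLS-typed secrets and stringData handling. Roughly (a sketch from the output; exact commands are in core.sh around lines 773-807, and the cert/key paths below are placeholders):

    kubectl create namespace test-secrets
    # a TLS secret surfaces type kubernetes.io/tls
    kubectl create secret tls test-secret --namespace=test-secrets --cert=tls.crt --key=tls.key
    kubectl get secret/test-secret --namespace=test-secrets -o go-template='{{.type}}'       # expect: kubernetes.io/tls
    kubectl delete secret test-secret --namespace=test-secrets
    # a secret created with stringData (e.g. k1: v1, k2: v2) is persisted base64-encoded under .data,
    # which is why the assertion sees map[k1:djE= k2:djI=] and .stringData comes back as <no value>
    kubectl get secret/secret-string-data --namespace=test-secrets -o go-template='{{.data}}'
    kubectl delete namespace test-secrets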
I0712 08:47:38.371] Recording: run_configmap_tests
I0712 08:47:38.371] Running command: run_configmap_tests
I0712 08:47:38.393] 
I0712 08:47:38.394] +++ Running case: test-cmd.run_configmap_tests 
I0712 08:47:38.396] +++ working dir: /go/src/k8s.io/kubernetes
... skipping 14 lines ...
I0712 08:47:39.432] configmap/test-binary-configmap created
I0712 08:47:39.516] core.sh:48: Successful get configmap/test-configmap --namespace=test-configmaps {{.metadata.name}}: test-configmap
I0712 08:47:39.604] core.sh:49: Successful get configmap/test-binary-configmap --namespace=test-configmaps {{.metadata.name}}: test-binary-configmap
I0712 08:47:39.836] configmap "test-configmap" deleted
I0712 08:47:39.914] configmap "test-binary-configmap" deleted
I0712 08:47:39.993] namespace "test-configmaps" deleted
W0712 08:47:40.094] E0712 08:47:38.723844   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:40.095] E0712 08:47:38.814351   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:40.095] E0712 08:47:38.900351   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:40.095] E0712 08:47:38.997722   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:40.096] E0712 08:47:39.725604   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:40.096] E0712 08:47:39.815873   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:40.096] E0712 08:47:39.902122   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:40.096] E0712 08:47:39.999024   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:40.727] E0712 08:47:40.726962   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:40.818] E0712 08:47:40.817468   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:40.904] E0712 08:47:40.903487   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:41.001] E0712 08:47:41.000589   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:41.729] E0712 08:47:41.728822   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:41.819] E0712 08:47:41.818939   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:41.906] E0712 08:47:41.905259   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:42.002] E0712 08:47:42.002148   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:42.731] E0712 08:47:42.730548   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:42.821] E0712 08:47:42.820711   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:42.907] E0712 08:47:42.906860   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:43.004] E0712 08:47:43.003863   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:43.732] E0712 08:47:43.731928   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:43.823] E0712 08:47:43.822334   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:43.909] E0712 08:47:43.908707   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:44.006] E0712 08:47:44.005656   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:44.734] E0712 08:47:44.733590   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:44.824] E0712 08:47:44.823978   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:44.911] E0712 08:47:44.910397   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:45.007] E0712 08:47:45.007002   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:47:45.108] +++ exit code: 0
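run_configmap_tests follows the same create/verify/delete pattern in a dedicated namespace (a sketch from the output; the literal data and file path below are placeholders, the real fixtures are in core.sh around lines 48-49):

    kubectl create namespace test-configmaps
    kubectl create configmap test-configmap --namespace=test-configmaps --from-literal=key=value
    kubectl create configmap test-binary-configmap --namespace=test-configmaps --from-file=binary=./some-binary-file
    kubectl get configmap/test-configmap --namespace=test-configmaps -o go-template='{{.metadata.name}}'
    kubectl delete configmap test-configmap test-binary-configmap --namespace=test-configmaps
    kubectl delete namespace test-configmaps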
I0712 08:47:45.136] Recording: run_client_config_tests
I0712 08:47:45.137] Running command: run_client_config_tests
I0712 08:47:45.159] 
I0712 08:47:45.161] +++ Running case: test-cmd.run_client_config_tests 
I0712 08:47:45.163] +++ working dir: /go/src/k8s.io/kubernetes
I0712 08:47:45.166] +++ command: run_client_config_tests
I0712 08:47:45.179] +++ [0712 08:47:45] Creating namespace namespace-1562921265-25302
I0712 08:47:45.253] namespace/namespace-1562921265-25302 created
I0712 08:47:45.322] Context "test" modified.
I0712 08:47:45.329] +++ [0712 08:47:45] Testing client config
I0712 08:47:45.399] Successful
I0712 08:47:45.399] message:error: stat missing: no such file or directory
I0712 08:47:45.399] has:missing: no such file or directory
I0712 08:47:45.473] Successful
I0712 08:47:45.473] message:error: stat missing: no such file or directory
I0712 08:47:45.473] has:missing: no such file or directory
I0712 08:47:45.539] Successful
I0712 08:47:45.539] message:error: stat missing: no such file or directory
I0712 08:47:45.540] has:missing: no such file or directory
I0712 08:47:45.613] Successful
I0712 08:47:45.613] message:Error in configuration: context was not found for specified context: missing-context
I0712 08:47:45.613] has:context was not found for specified context: missing-context
I0712 08:47:45.685] Successful
I0712 08:47:45.685] message:error: no server found for cluster "missing-cluster"
I0712 08:47:45.686] has:no server found for cluster "missing-cluster"
I0712 08:47:45.754] Successful
I0712 08:47:45.755] message:error: auth info "missing-user" does not exist
I0712 08:47:45.755] has:auth info "missing-user" does not exist
W0712 08:47:45.856] E0712 08:47:45.735317   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:45.857] E0712 08:47:45.825454   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:45.912] E0712 08:47:45.911952   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:46.009] E0712 08:47:46.008555   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:47:46.109] Successful
I0712 08:47:46.110] message:error: error loading config file "/tmp/newconfig.yaml": no kind "Config" is registered for version "v-1" in scheme "k8s.io/client-go/tools/clientcmd/api/latest/latest.go:50"
I0712 08:47:46.110] has:error loading config file
I0712 08:47:46.110] Successful
I0712 08:47:46.110] message:error: stat missing-config: no such file or directory
I0712 08:47:46.110] has:no such file or directory
I0712 08:47:46.111] +++ exit code: 0
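run_client_config_tests only checks kubectl's error handling for bad client configuration; each invocation is expected to fail with the quoted message. A sketch (the resource being fetched is arbitrary, and the pairing of flag to message is inferred from the output):

    kubectl get pods --kubeconfig=missing                # error: stat missing: no such file or directory
    kubectl get pods --context=missing-context           # Error in configuration: context was not found for specified context: missing-context
    kubectl get pods --cluster=missing-cluster           # error: no server found for cluster "missing-cluster"
    kubectl get pods --user=missing-user                 # error: auth info "missing-user" does not exist
    kubectl get pods --kubeconfig=/tmp/newconfig.yaml    # rejected: no kind "Config" is registered for version "v-1"
    kubectl get pods --kubeconfig=missing-config         # error: stat missing-config: no such file or directory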
I0712 08:47:46.111] Recording: run_service_accounts_tests
I0712 08:47:46.111] Running command: run_service_accounts_tests
I0712 08:47:46.111] 
I0712 08:47:46.111] +++ Running case: test-cmd.run_service_accounts_tests 
... skipping 7 lines ...
I0712 08:47:46.354] namespace/test-service-accounts created
I0712 08:47:46.445] core.sh:832: Successful get namespaces/test-service-accounts {{.metadata.name}}: test-service-accounts
I0712 08:47:46.515] serviceaccount/test-service-account created
I0712 08:47:46.605] core.sh:838: Successful get serviceaccount/test-service-account --namespace=test-service-accounts {{.metadata.name}}: test-service-account
I0712 08:47:46.680] serviceaccount "test-service-account" deleted
I0712 08:47:46.757] namespace "test-service-accounts" deleted
W0712 08:47:46.858] E0712 08:47:46.736829   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:46.858] E0712 08:47:46.827211   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:46.914] E0712 08:47:46.913531   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:47.011] E0712 08:47:47.010219   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:47.739] E0712 08:47:47.738442   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:47.829] E0712 08:47:47.829039   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:47.916] E0712 08:47:47.915546   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:48.012] E0712 08:47:48.011845   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:48.740] E0712 08:47:48.740123   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:48.831] E0712 08:47:48.830722   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:48.917] E0712 08:47:48.917119   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:49.014] E0712 08:47:49.013427   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:49.742] E0712 08:47:49.741665   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:49.833] E0712 08:47:49.832284   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:49.919] E0712 08:47:49.918528   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:50.015] E0712 08:47:50.015106   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:50.744] E0712 08:47:50.743175   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:50.834] E0712 08:47:50.833988   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:50.920] E0712 08:47:50.920132   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:51.017] E0712 08:47:51.016925   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:51.745] E0712 08:47:51.744613   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:51.835] E0712 08:47:51.834764   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:51.921] E0712 08:47:51.921415   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:52.019] E0712 08:47:52.018522   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:47:52.120] +++ exit code: 0
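run_service_accounts_tests is a minimal lifecycle check (a sketch from the output; the assertions are around core.sh:832-838):

    kubectl create namespace test-service-accounts
    kubectl create serviceaccount test-service-account --namespace=test-service-accounts
    kubectl get serviceaccount/test-service-account --namespace=test-service-accounts -o go-template='{{.metadata.name}}'
    kubectl delete serviceaccount test-service-account --namespace=test-service-accounts
    kubectl delete namespace test-service-accounts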
I0712 08:47:52.120] Recording: run_job_tests
I0712 08:47:52.121] Running command: run_job_tests
I0712 08:47:52.121] 
I0712 08:47:52.121] +++ Running case: test-cmd.run_job_tests 
I0712 08:47:52.121] +++ working dir: /go/src/k8s.io/kubernetes
... skipping 14 lines ...
I0712 08:47:52.660] Labels:                        run=pi
I0712 08:47:52.660] Annotations:                   <none>
I0712 08:47:52.660] Schedule:                      59 23 31 2 *
I0712 08:47:52.661] Concurrency Policy:            Allow
I0712 08:47:52.661] Suspend:                       False
I0712 08:47:52.661] Successful Job History Limit:  3
I0712 08:47:52.661] Failed Job History Limit:      1
I0712 08:47:52.661] Starting Deadline Seconds:     <unset>
I0712 08:47:52.661] Selector:                      <unset>
I0712 08:47:52.661] Parallelism:                   <unset>
I0712 08:47:52.661] Completions:                   <unset>
I0712 08:47:52.661] Pod Template:
I0712 08:47:52.661]   Labels:  run=pi
... skipping 32 lines ...
I0712 08:47:53.178]                 run=pi
I0712 08:47:53.178] Annotations:    cronjob.kubernetes.io/instantiate: manual
I0712 08:47:53.178] Controlled By:  CronJob/pi
I0712 08:47:53.178] Parallelism:    1
I0712 08:47:53.178] Completions:    1
I0712 08:47:53.179] Start Time:     Fri, 12 Jul 2019 08:47:52 +0000
I0712 08:47:53.179] Pods Statuses:  1 Running / 0 Succeeded / 0 Failed
I0712 08:47:53.179] Pod Template:
I0712 08:47:53.179]   Labels:  controller-uid=dc397d39-dcab-40ea-a34c-5ccc366f5bcc
I0712 08:47:53.179]            job-name=test-job
I0712 08:47:53.179]            run=pi
I0712 08:47:53.179]   Containers:
I0712 08:47:53.179]    pi:
... skipping 16 lines ...
I0712 08:47:53.180]   ----    ------            ----  ----            -------
I0712 08:47:53.180]   Normal  SuccessfulCreate  1s    job-controller  Created pod: test-job-scqc8
I0712 08:47:53.263] job.batch "test-job" deleted
I0712 08:47:53.345] cronjob.batch "pi" deleted
I0712 08:47:53.426] namespace "test-jobs" deleted
W0712 08:47:53.527] kubectl run --generator=cronjob/v1beta1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
W0712 08:47:53.528] E0712 08:47:52.746040   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:53.528] E0712 08:47:52.836747   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:53.529] I0712 08:47:52.921036   51895 event.go:255] Event(v1.ObjectReference{Kind:"Job", Namespace:"test-jobs", Name:"test-job", UID:"dc397d39-dcab-40ea-a34c-5ccc366f5bcc", APIVersion:"batch/v1", ResourceVersion:"1346", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-scqc8
W0712 08:47:53.529] E0712 08:47:52.924442   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:53.529] I0712 08:47:52.932954   51895 event.go:255] Event(v1.ObjectReference{Kind:"CronJob", Namespace:"test-jobs", Name:"pi", UID:"9f7f787d-8b92-45ab-ba7d-7a8d42fc0477", APIVersion:"batch/v1beta1", ResourceVersion:"1344", FieldPath:""}): type: 'Warning' reason: 'UnexpectedJob' Saw a job that the controller did not create or forgot: test-job
W0712 08:47:53.530] E0712 08:47:52.940668   51895 cronjob_controller.go:272] Cannot determine if test-jobs/pi needs to be started: too many missed start time (> 100). Set or decrease .spec.startingDeadlineSeconds or check clock skew
W0712 08:47:53.530] I0712 08:47:52.940743   51895 event.go:255] Event(v1.ObjectReference{Kind:"CronJob", Namespace:"test-jobs", Name:"pi", UID:"9f7f787d-8b92-45ab-ba7d-7a8d42fc0477", APIVersion:"batch/v1beta1", ResourceVersion:"1344", FieldPath:""}): type: 'Warning' reason: 'FailedNeedsStart' Cannot determine if job needs to be started: too many missed start time (> 100). Set or decrease .spec.startingDeadlineSeconds or check clock skew
W0712 08:47:53.530] E0712 08:47:53.019921   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:53.748] E0712 08:47:53.747831   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:53.838] E0712 08:47:53.838128   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:53.927] E0712 08:47:53.926294   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:54.022] E0712 08:47:54.021395   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:54.750] E0712 08:47:54.749429   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:54.840] E0712 08:47:54.839561   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:54.928] E0712 08:47:54.927747   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:55.023] E0712 08:47:55.022863   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:55.751] E0712 08:47:55.751151   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:55.841] E0712 08:47:55.841045   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:55.929] E0712 08:47:55.929205   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:56.025] E0712 08:47:56.024580   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:56.753] E0712 08:47:56.752932   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:56.843] E0712 08:47:56.842526   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:56.931] E0712 08:47:56.931110   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:57.027] E0712 08:47:57.026240   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:57.755] E0712 08:47:57.754585   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:57.845] E0712 08:47:57.844135   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:57.933] E0712 08:47:57.932647   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:47:58.028] E0712 08:47:58.027789   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:47:58.541] +++ exit code: 0
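run_job_tests drives a CronJob and a manually instantiated Job in the test-jobs namespace. Roughly, based on the describe output and controller events above (the image and container command are placeholders, and exact flags may differ):

    # the deprecated cronjob generator, per the warning in the log
    kubectl run pi --generator=cronjob/v1beta1 --schedule="59 23 31 2 *" --namespace=test-jobs --image=perl -- perl -e 'print "pi"'
    kubectl describe cronjob pi --namespace=test-jobs
    # manual instantiation from the CronJob; this is what produces the
    # cronjob.kubernetes.io/instantiate: manual annotation and Controlled By: CronJob/pi
    kubectl create job test-job --from=cronjob/pi --namespace=test-jobs
    kubectl describe job test-job --namespace=test-jobs
    kubectl delete job test-job --namespace=test-jobs
    kubectl delete cronjob pi --namespace=test-jobs
    kubectl delete namespace test-jobs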
I0712 08:47:58.577] Recording: run_create_job_tests
I0712 08:47:58.578] Running command: run_create_job_tests
I0712 08:47:58.599] 
I0712 08:47:58.602] +++ Running case: test-cmd.run_create_job_tests 
I0712 08:47:58.604] +++ working dir: /go/src/k8s.io/kubernetes
... skipping 27 lines ...
I0712 08:47:59.899] +++ [0712 08:47:59] Testing pod templates
I0712 08:47:59.978] core.sh:1415: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:48:00.120] podtemplate/nginx created
I0712 08:48:00.215] core.sh:1419: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:
I0712 08:48:00.287] NAME    CONTAINERS   IMAGES   POD LABELS
I0712 08:48:00.288] nginx   nginx        nginx    name=nginx
W0712 08:48:00.388] E0712 08:47:58.755824   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:00.389] I0712 08:47:58.843846   51895 event.go:255] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1562921278-31879", Name:"test-job", UID:"26dc52d7-7f86-4f3a-8a64-d4e2767284ca", APIVersion:"batch/v1", ResourceVersion:"1367", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-7sj4b
W0712 08:48:00.390] E0712 08:47:58.845692   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:00.390] E0712 08:47:58.933779   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:00.390] E0712 08:47:59.028871   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:00.391] I0712 08:47:59.072684   51895 event.go:255] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1562921278-31879", Name:"test-job-pi", UID:"6719f143-2f69-4556-9d06-22e2634ad4af", APIVersion:"batch/v1", ResourceVersion:"1375", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-pi-vff5j
W0712 08:48:00.391] kubectl run --generator=cronjob/v1beta1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
W0712 08:48:00.392] I0712 08:47:59.423991   51895 event.go:255] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1562921278-31879", Name:"my-pi", UID:"3a2aff90-8a1c-4be1-809e-17d148c38e60", APIVersion:"batch/v1", ResourceVersion:"1383", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: my-pi-ktq6s
W0712 08:48:00.392] E0712 08:47:59.757127   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:00.392] E0712 08:47:59.846588   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:00.393] E0712 08:47:59.935185   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:00.393] E0712 08:48:00.030284   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:00.393] I0712 08:48:00.117251   48560 controller.go:606] quota admission added evaluator for: podtemplates
I0712 08:48:00.494] core.sh:1427: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:
I0712 08:48:00.541] podtemplate "nginx" deleted
I0712 08:48:00.633] core.sh:1431: Successful get podtemplate {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:48:00.644] +++ exit code: 0
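The pod template block creates a PodTemplate from a manifest and lists it; the "quota admission added evaluator for: podtemplates" line appears presumably because this is the first use of the resource in the run. A sketch (the manifest filename is a placeholder; the log only shows "podtemplate/nginx created"):

    kubectl create -f podtemplate.yaml        # a PodTemplate named nginx with an nginx container and label name=nginx
    kubectl get podtemplates -o go-template='{{range.items}}{{.metadata.name}}:{{end}}'   # expect: nginx:
    kubectl get podtemplates                  # tabular view: NAME, CONTAINERS, IMAGES, POD LABELS
    kubectl delete podtemplate nginx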
I0712 08:48:00.681] Recording: run_service_tests
... skipping 66 lines ...
I0712 08:48:01.530] Port:              <unset>  6379/TCP
I0712 08:48:01.530] TargetPort:        6379/TCP
I0712 08:48:01.530] Endpoints:         <none>
I0712 08:48:01.530] Session Affinity:  None
I0712 08:48:01.531] Events:            <none>
I0712 08:48:01.531] 
W0712 08:48:01.631] E0712 08:48:00.759029   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:01.632] E0712 08:48:00.847685   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:01.632] E0712 08:48:00.936489   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:01.632] E0712 08:48:01.031133   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:48:01.733] Successful describe services:
I0712 08:48:01.733] Name:              kubernetes
I0712 08:48:01.734] Namespace:         default
I0712 08:48:01.734] Labels:            component=apiserver
I0712 08:48:01.734]                    provider=kubernetes
I0712 08:48:01.734] Annotations:       <none>
... skipping 238 lines ...
I0712 08:48:02.535]   selector:
I0712 08:48:02.535]     role: padawan
I0712 08:48:02.535]   sessionAffinity: None
I0712 08:48:02.535]   type: ClusterIP
I0712 08:48:02.535] status:
I0712 08:48:02.535]   loadBalancer: {}
W0712 08:48:02.636] E0712 08:48:01.760975   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:02.636] E0712 08:48:01.849010   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:02.637] E0712 08:48:01.938404   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:02.637] E0712 08:48:02.032472   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:02.637] error: you must specify resources by --filename when --local is set.
W0712 08:48:02.638] Example resource specifications include:
W0712 08:48:02.638]    '-f rsrc.yaml'
W0712 08:48:02.638]    '--filename=rsrc.json'
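The "--local" error above comes from the service selector checks: with --local, kubectl never contacts the server, so the object to mutate must be supplied via -f/--filename. A sketch of the two shapes involved (the filename is a placeholder):

    # works: mutate a local manifest and print the result without touching the cluster
    kubectl set selector -f redis-master-service.yaml role=padawan --local -o yaml
    # fails with the error shown above: --local set but no --filename given
    kubectl set selector service redis-master role=padawan --local -o yaml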
I0712 08:48:02.738] core.sh:898: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
I0712 08:48:02.827] core.sh:905: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
I0712 08:48:02.893] service "redis-master" deleted
... skipping 8 lines ...
I0712 08:48:03.916] core.sh:952: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:service-v1-test:
I0712 08:48:03.999] service "redis-master" deleted
I0712 08:48:04.080] service "service-v1-test" deleted
I0712 08:48:04.184] core.sh:960: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
I0712 08:48:04.270] core.sh:964: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
I0712 08:48:04.416] service/redis-master created
W0712 08:48:04.517] E0712 08:48:02.762488   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:04.517] E0712 08:48:02.850108   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:04.517] E0712 08:48:02.940124   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:04.518] E0712 08:48:03.034043   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:04.518] E0712 08:48:03.764131   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:04.518] E0712 08:48:03.851207   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:04.518] E0712 08:48:03.941632   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:04.519] E0712 08:48:04.035464   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:48:04.619] service/redis-slave created
I0712 08:48:04.657] core.sh:969: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:redis-slave:
I0712 08:48:04.733] Successful
I0712 08:48:04.733] message:NAME           RSRC
I0712 08:48:04.733] kubernetes     144
I0712 08:48:04.734] redis-master   1417
... skipping 84 lines ...
I0712 08:48:09.363] apps.sh:84: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0712 08:48:09.447] apps.sh:85: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
I0712 08:48:09.538] daemonset.apps/bind rolled back
I0712 08:48:09.624] apps.sh:88: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I0712 08:48:09.702] apps.sh:89: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0712 08:48:09.794] Successful
I0712 08:48:09.794] message:error: unable to find specified revision 1000000 in history
I0712 08:48:09.795] has:unable to find specified revision
I0712 08:48:09.880] apps.sh:93: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I0712 08:48:09.965] apps.sh:94: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0712 08:48:10.056] daemonset.apps/bind rolled back
I0712 08:48:10.153] apps.sh:97: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
I0712 08:48:10.240] apps.sh:98: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
... skipping 13 lines ...
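The daemonset checks above (apps.sh:84-98) exercise rollout history and rollback. In outline (a sketch from the output; the exact invocations, including any --to-revision values, are in apps.sh):

    kubectl rollout undo daemonset/bind                         # back to the single-container (pause:2.0) revision
    kubectl rollout undo daemonset/bind --to-revision=1000000   # fails: unable to find specified revision 1000000 in history
    kubectl rollout undo daemonset/bind                         # undo again, restoring the two-container (pause:latest + nginx:test-cmd) spec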
I0712 08:48:10.731] core.sh:1046: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:48:10.874] replicationcontroller/frontend created
I0712 08:48:10.953] replicationcontroller "frontend" deleted
I0712 08:48:11.048] core.sh:1051: Successful get pods -l "name=frontend" {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:48:11.135] core.sh:1055: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:48:11.282] replicationcontroller/frontend created
W0712 08:48:11.383] E0712 08:48:04.765610   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.383] E0712 08:48:04.852787   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.384] E0712 08:48:04.942840   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.384] E0712 08:48:05.037041   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.384] kubectl run --generator=deployment/apps.v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
W0712 08:48:11.385] I0712 08:48:05.690171   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"default", Name:"testmetadata", UID:"2daf9a35-5844-4f12-b147-f5bfd7408313", APIVersion:"apps/v1", ResourceVersion:"1433", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set testmetadata-6768bb66d4 to 2
W0712 08:48:11.385] I0712 08:48:05.696696   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"default", Name:"testmetadata-6768bb66d4", UID:"37cad156-f288-44b9-8825-b5dd3400c3a6", APIVersion:"apps/v1", ResourceVersion:"1434", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: testmetadata-6768bb66d4-qp48x
W0712 08:48:11.386] I0712 08:48:05.700118   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"default", Name:"testmetadata-6768bb66d4", UID:"37cad156-f288-44b9-8825-b5dd3400c3a6", APIVersion:"apps/v1", ResourceVersion:"1434", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: testmetadata-6768bb66d4-kslgs
W0712 08:48:11.386] E0712 08:48:05.767066   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.386] E0712 08:48:05.854396   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.387] E0712 08:48:05.944304   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.387] E0712 08:48:06.038651   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.387] I0712 08:48:06.688022   48560 controller.go:606] quota admission added evaluator for: daemonsets.apps
W0712 08:48:11.388] I0712 08:48:06.700205   48560 controller.go:606] quota admission added evaluator for: controllerrevisions.apps
W0712 08:48:11.388] E0712 08:48:06.768536   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.388] E0712 08:48:06.855824   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.389] E0712 08:48:06.946118   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.389] E0712 08:48:07.039938   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.389] E0712 08:48:07.769861   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.390] E0712 08:48:07.857258   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.390] E0712 08:48:07.947597   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.390] E0712 08:48:08.041353   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.390] E0712 08:48:08.770829   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.390] E0712 08:48:08.858243   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.391] E0712 08:48:08.948652   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.391] E0712 08:48:09.042648   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.391] E0712 08:48:09.772200   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.391] E0712 08:48:09.859471   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.391] E0712 08:48:09.949784   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.392] E0712 08:48:10.043725   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.392] E0712 08:48:10.773499   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.392] E0712 08:48:10.860620   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.392] I0712 08:48:10.879931   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"frontend", UID:"f07efe96-11ee-4382-aa6e-2a9f8278bf5e", APIVersion:"v1", ResourceVersion:"1509", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-96gms
W0712 08:48:11.393] I0712 08:48:10.884083   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"frontend", UID:"f07efe96-11ee-4382-aa6e-2a9f8278bf5e", APIVersion:"v1", ResourceVersion:"1509", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-85bhm
W0712 08:48:11.393] I0712 08:48:10.884453   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"frontend", UID:"f07efe96-11ee-4382-aa6e-2a9f8278bf5e", APIVersion:"v1", ResourceVersion:"1509", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-6nvvs
W0712 08:48:11.393] E0712 08:48:10.950837   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.393] E0712 08:48:11.045070   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.394] I0712 08:48:11.292550   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"frontend", UID:"88a8ded3-b5f5-400d-9db3-536d2e6b6e32", APIVersion:"v1", ResourceVersion:"1526", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-kqjfh
W0712 08:48:11.394] I0712 08:48:11.306788   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"frontend", UID:"88a8ded3-b5f5-400d-9db3-536d2e6b6e32", APIVersion:"v1", ResourceVersion:"1526", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-k4rlt
W0712 08:48:11.394] I0712 08:48:11.307098   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"frontend", UID:"88a8ded3-b5f5-400d-9db3-536d2e6b6e32", APIVersion:"v1", ResourceVersion:"1526", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-6bkdf
I0712 08:48:11.495] core.sh:1059: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
I0712 08:48:11.528] core.sh:1061: Successful describe rc frontend:
I0712 08:48:11.529] Name:         frontend
I0712 08:48:11.529] Namespace:    namespace-1562921290-25107
I0712 08:48:11.529] Selector:     app=guestbook,tier=frontend
I0712 08:48:11.529] Labels:       app=guestbook
I0712 08:48:11.529]               tier=frontend
I0712 08:48:11.530] Annotations:  <none>
I0712 08:48:11.530] Replicas:     3 current / 3 desired
I0712 08:48:11.530] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0712 08:48:11.530] Pod Template:
I0712 08:48:11.530]   Labels:  app=guestbook
I0712 08:48:11.530]            tier=frontend
I0712 08:48:11.530]   Containers:
I0712 08:48:11.530]    php-redis:
I0712 08:48:11.530]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
I0712 08:48:11.629] Namespace:    namespace-1562921290-25107
I0712 08:48:11.630] Selector:     app=guestbook,tier=frontend
I0712 08:48:11.630] Labels:       app=guestbook
I0712 08:48:11.630]               tier=frontend
I0712 08:48:11.630] Annotations:  <none>
I0712 08:48:11.630] Replicas:     3 current / 3 desired
I0712 08:48:11.630] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0712 08:48:11.630] Pod Template:
I0712 08:48:11.630]   Labels:  app=guestbook
I0712 08:48:11.630]            tier=frontend
I0712 08:48:11.630]   Containers:
I0712 08:48:11.630]    php-redis:
I0712 08:48:11.630]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
I0712 08:48:11.725] Namespace:    namespace-1562921290-25107
I0712 08:48:11.725] Selector:     app=guestbook,tier=frontend
I0712 08:48:11.725] Labels:       app=guestbook
I0712 08:48:11.725]               tier=frontend
I0712 08:48:11.725] Annotations:  <none>
I0712 08:48:11.725] Replicas:     3 current / 3 desired
I0712 08:48:11.726] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0712 08:48:11.726] Pod Template:
I0712 08:48:11.726]   Labels:  app=guestbook
I0712 08:48:11.726]            tier=frontend
I0712 08:48:11.726]   Containers:
I0712 08:48:11.726]    php-redis:
I0712 08:48:11.726]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 12 lines ...
I0712 08:48:11.832] Namespace:    namespace-1562921290-25107
I0712 08:48:11.832] Selector:     app=guestbook,tier=frontend
I0712 08:48:11.832] Labels:       app=guestbook
I0712 08:48:11.833]               tier=frontend
I0712 08:48:11.833] Annotations:  <none>
I0712 08:48:11.833] Replicas:     3 current / 3 desired
I0712 08:48:11.833] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0712 08:48:11.833] Pod Template:
I0712 08:48:11.833]   Labels:  app=guestbook
I0712 08:48:11.833]            tier=frontend
I0712 08:48:11.834]   Containers:
I0712 08:48:11.834]    php-redis:
I0712 08:48:11.834]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 10 lines ...
I0712 08:48:11.835]   Type    Reason            Age   From                    Message
I0712 08:48:11.835]   ----    ------            ----  ----                    -------
I0712 08:48:11.835]   Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-kqjfh
I0712 08:48:11.836]   Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-k4rlt
I0712 08:48:11.836]   Normal  SuccessfulCreate  0s    replication-controller  Created pod: frontend-6bkdf
I0712 08:48:11.836] 
W0712 08:48:11.937] E0712 08:48:11.774733   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.937] E0712 08:48:11.862060   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:11.952] E0712 08:48:11.952292   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:12.048] E0712 08:48:12.047870   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:48:12.149] Successful describe rc:
I0712 08:48:12.150] Name:         frontend
I0712 08:48:12.150] Namespace:    namespace-1562921290-25107
I0712 08:48:12.150] Selector:     app=guestbook,tier=frontend
I0712 08:48:12.150] Labels:       app=guestbook
I0712 08:48:12.150]               tier=frontend
I0712 08:48:12.150] Annotations:  <none>
I0712 08:48:12.150] Replicas:     3 current / 3 desired
I0712 08:48:12.151] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0712 08:48:12.151] Pod Template:
I0712 08:48:12.151]   Labels:  app=guestbook
I0712 08:48:12.151]            tier=frontend
I0712 08:48:12.151]   Containers:
I0712 08:48:12.151]    php-redis:
I0712 08:48:12.151]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
I0712 08:48:12.153] Namespace:    namespace-1562921290-25107
I0712 08:48:12.153] Selector:     app=guestbook,tier=frontend
I0712 08:48:12.153] Labels:       app=guestbook
I0712 08:48:12.153]               tier=frontend
I0712 08:48:12.153] Annotations:  <none>
I0712 08:48:12.153] Replicas:     3 current / 3 desired
I0712 08:48:12.153] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0712 08:48:12.153] Pod Template:
I0712 08:48:12.153]   Labels:  app=guestbook
I0712 08:48:12.153]            tier=frontend
I0712 08:48:12.153]   Containers:
I0712 08:48:12.153]    php-redis:
I0712 08:48:12.154]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
I0712 08:48:12.179] Namespace:    namespace-1562921290-25107
I0712 08:48:12.180] Selector:     app=guestbook,tier=frontend
I0712 08:48:12.180] Labels:       app=guestbook
I0712 08:48:12.180]               tier=frontend
I0712 08:48:12.180] Annotations:  <none>
I0712 08:48:12.180] Replicas:     3 current / 3 desired
I0712 08:48:12.180] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0712 08:48:12.181] Pod Template:
I0712 08:48:12.181]   Labels:  app=guestbook
I0712 08:48:12.181]            tier=frontend
I0712 08:48:12.181]   Containers:
I0712 08:48:12.181]    php-redis:
I0712 08:48:12.181]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 11 lines ...
I0712 08:48:12.281] Namespace:    namespace-1562921290-25107
I0712 08:48:12.282] Selector:     app=guestbook,tier=frontend
I0712 08:48:12.282] Labels:       app=guestbook
I0712 08:48:12.282]               tier=frontend
I0712 08:48:12.282] Annotations:  <none>
I0712 08:48:12.282] Replicas:     3 current / 3 desired
I0712 08:48:12.282] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0712 08:48:12.282] Pod Template:
I0712 08:48:12.282]   Labels:  app=guestbook
I0712 08:48:12.282]            tier=frontend
I0712 08:48:12.282]   Containers:
I0712 08:48:12.282]    php-redis:
I0712 08:48:12.282]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 22 lines ...
I0712 08:48:13.065] core.sh:1099: Successful get rc frontend {{.spec.replicas}}: 3
I0712 08:48:13.151] core.sh:1103: Successful get rc frontend {{.spec.replicas}}: 3
I0712 08:48:13.227] replicationcontroller/frontend scaled
I0712 08:48:13.325] core.sh:1107: Successful get rc frontend {{.spec.replicas}}: 2
I0712 08:48:13.405] replicationcontroller "frontend" deleted
W0712 08:48:13.505] I0712 08:48:12.454499   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"frontend", UID:"88a8ded3-b5f5-400d-9db3-536d2e6b6e32", APIVersion:"v1", ResourceVersion:"1535", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-6bkdf
W0712 08:48:13.506] error: Expected replicas to be 3, was 2
W0712 08:48:13.506] E0712 08:48:12.776179   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:13.506] E0712 08:48:12.863191   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:13.507] E0712 08:48:12.953523   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:13.507] I0712 08:48:12.974992   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"frontend", UID:"88a8ded3-b5f5-400d-9db3-536d2e6b6e32", APIVersion:"v1", ResourceVersion:"1542", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-wxbs9
W0712 08:48:13.507] E0712 08:48:13.048995   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:13.507] I0712 08:48:13.236185   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"frontend", UID:"88a8ded3-b5f5-400d-9db3-536d2e6b6e32", APIVersion:"v1", ResourceVersion:"1548", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-wxbs9
W0712 08:48:13.575] I0712 08:48:13.574380   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"redis-master", UID:"3affe900-8e24-4b1d-ba62-76bf4d009bdf", APIVersion:"v1", ResourceVersion:"1559", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-fmsgl
I0712 08:48:13.675] replicationcontroller/redis-master created
I0712 08:48:13.738] replicationcontroller/redis-slave created
W0712 08:48:13.839] I0712 08:48:13.744345   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"redis-slave", UID:"cbecd2ae-c0ac-4f0c-9d79-0d03fdbb3cf9", APIVersion:"v1", ResourceVersion:"1564", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-nwhrv
W0712 08:48:13.839] I0712 08:48:13.750362   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"redis-slave", UID:"cbecd2ae-c0ac-4f0c-9d79-0d03fdbb3cf9", APIVersion:"v1", ResourceVersion:"1564", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-gg7fx
W0712 08:48:13.839] E0712 08:48:13.778035   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:13.848] I0712 08:48:13.848236   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"redis-master", UID:"3affe900-8e24-4b1d-ba62-76bf4d009bdf", APIVersion:"v1", ResourceVersion:"1571", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-7qcmp
W0712 08:48:13.853] I0712 08:48:13.852943   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"redis-master", UID:"3affe900-8e24-4b1d-ba62-76bf4d009bdf", APIVersion:"v1", ResourceVersion:"1571", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-frf8k
W0712 08:48:13.854] I0712 08:48:13.853456   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"redis-master", UID:"3affe900-8e24-4b1d-ba62-76bf4d009bdf", APIVersion:"v1", ResourceVersion:"1571", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-6vxxc
W0712 08:48:13.856] I0712 08:48:13.856392   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"redis-slave", UID:"cbecd2ae-c0ac-4f0c-9d79-0d03fdbb3cf9", APIVersion:"v1", ResourceVersion:"1573", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-7rf2h
W0712 08:48:13.864] I0712 08:48:13.863461   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"redis-slave", UID:"cbecd2ae-c0ac-4f0c-9d79-0d03fdbb3cf9", APIVersion:"v1", ResourceVersion:"1573", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-p8jzn
W0712 08:48:13.865] E0712 08:48:13.864709   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:13.955] E0712 08:48:13.955162   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:14.051] E0712 08:48:14.050613   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:48:14.152] replicationcontroller/redis-master scaled
I0712 08:48:14.152] replicationcontroller/redis-slave scaled
I0712 08:48:14.153] core.sh:1117: Successful get rc redis-master {{.spec.replicas}}: 4
I0712 08:48:14.153] core.sh:1118: Successful get rc redis-slave {{.spec.replicas}}: 4
I0712 08:48:14.153] replicationcontroller "redis-master" deleted
I0712 08:48:14.153] replicationcontroller "redis-slave" deleted
... skipping 10 lines ...
I0712 08:48:14.572] deployment.apps "nginx-deployment" deleted
I0712 08:48:14.667] Successful
I0712 08:48:14.668] message:service/expose-test-deployment exposed
I0712 08:48:14.668] has:service/expose-test-deployment exposed
I0712 08:48:14.753] service "expose-test-deployment" deleted
I0712 08:48:14.841] Successful
I0712 08:48:14.841] message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
I0712 08:48:14.841] See 'kubectl expose -h' for help and examples
I0712 08:48:14.842] has:invalid deployment: no selectors
W0712 08:48:14.942] E0712 08:48:14.779445   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:14.943] E0712 08:48:14.866045   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:14.957] E0712 08:48:14.956713   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:14.993] I0712 08:48:14.992759   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921290-25107", Name:"nginx-deployment", UID:"b30e8a35-ff4b-42bc-81ca-ebf79efa1825", APIVersion:"apps/v1", ResourceVersion:"1643", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-59f44f48d4 to 3
W0712 08:48:14.999] I0712 08:48:14.998352   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921290-25107", Name:"nginx-deployment-59f44f48d4", UID:"f74205f1-e5ba-4477-b99f-2e5fe6112890", APIVersion:"apps/v1", ResourceVersion:"1644", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-59f44f48d4-n2w8g
W0712 08:48:15.004] I0712 08:48:15.003739   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921290-25107", Name:"nginx-deployment-59f44f48d4", UID:"f74205f1-e5ba-4477-b99f-2e5fe6112890", APIVersion:"apps/v1", ResourceVersion:"1644", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-59f44f48d4-6whq5
W0712 08:48:15.006] I0712 08:48:15.005581   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921290-25107", Name:"nginx-deployment-59f44f48d4", UID:"f74205f1-e5ba-4477-b99f-2e5fe6112890", APIVersion:"apps/v1", ResourceVersion:"1644", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-59f44f48d4-5dn65
W0712 08:48:15.052] E0712 08:48:15.051825   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:48:15.153] deployment.apps/nginx-deployment created
I0712 08:48:15.153] core.sh:1146: Successful get deployment nginx-deployment {{.spec.replicas}}: 3
I0712 08:48:15.171] service/nginx-deployment exposed
I0712 08:48:15.259] core.sh:1150: Successful get service nginx-deployment {{(index .spec.ports 0).port}}: 80
I0712 08:48:15.344] deployment.apps "nginx-deployment" deleted
I0712 08:48:15.354] service "nginx-deployment" deleted
... skipping 14 lines ...
I0712 08:48:16.877] service "frontend" deleted
I0712 08:48:16.884] service "frontend-2" deleted
I0712 08:48:16.892] service "frontend-3" deleted
I0712 08:48:16.900] service "frontend-4" deleted
I0712 08:48:16.908] service "frontend-5" deleted
I0712 08:48:16.996] Successful
I0712 08:48:16.996] message:error: cannot expose a Node
I0712 08:48:16.996] has:cannot expose
I0712 08:48:17.092] Successful
I0712 08:48:17.092] message:The Service "invalid-large-service-name-that-has-more-than-sixty-three-characters" is invalid: metadata.name: Invalid value: "invalid-large-service-name-that-has-more-than-sixty-three-characters": must be no more than 63 characters
I0712 08:48:17.092] has:metadata.name: Invalid value
I0712 08:48:17.189] Successful
I0712 08:48:17.189] message:service/kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
... skipping 10 lines ...
I0712 08:48:17.892] core.sh:1219: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:48:17.971] core.sh:1223: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:48:18.113] replicationcontroller/frontend created
W0712 08:48:18.215] I0712 08:48:15.512908   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"frontend", UID:"ec3d70b6-8156-47f8-9cc8-1f5d07f1fa0d", APIVersion:"v1", ResourceVersion:"1672", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-ml5bl
W0712 08:48:18.216] I0712 08:48:15.518356   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"frontend", UID:"ec3d70b6-8156-47f8-9cc8-1f5d07f1fa0d", APIVersion:"v1", ResourceVersion:"1672", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-2gnwr
W0712 08:48:18.216] I0712 08:48:15.518906   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"frontend", UID:"ec3d70b6-8156-47f8-9cc8-1f5d07f1fa0d", APIVersion:"v1", ResourceVersion:"1672", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-qr2wk
W0712 08:48:18.216] E0712 08:48:15.781003   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:18.217] E0712 08:48:15.867422   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:18.217] E0712 08:48:15.958202   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:18.217] E0712 08:48:16.053082   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:18.218] E0712 08:48:16.782440   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:18.218] E0712 08:48:16.869135   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:18.218] E0712 08:48:16.959673   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:18.219] E0712 08:48:17.054337   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:18.219] E0712 08:48:17.784496   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:18.220] E0712 08:48:17.870866   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:18.220] E0712 08:48:17.961078   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:18.221] E0712 08:48:18.055834   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:18.221] I0712 08:48:18.118695   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"frontend", UID:"97726132-9b47-427b-90c9-695c6ad98123", APIVersion:"v1", ResourceVersion:"1734", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-p9hhl
W0712 08:48:18.222] I0712 08:48:18.123811   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"frontend", UID:"97726132-9b47-427b-90c9-695c6ad98123", APIVersion:"v1", ResourceVersion:"1734", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-5mkzt
W0712 08:48:18.222] I0712 08:48:18.124247   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"frontend", UID:"97726132-9b47-427b-90c9-695c6ad98123", APIVersion:"v1", ResourceVersion:"1734", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-lb4bd
W0712 08:48:18.294] I0712 08:48:18.293748   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"redis-slave", UID:"e9e217ee-874d-440f-867c-019b2308504a", APIVersion:"v1", ResourceVersion:"1743", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-llslz
W0712 08:48:18.301] I0712 08:48:18.301271   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"redis-slave", UID:"e9e217ee-874d-440f-867c-019b2308504a", APIVersion:"v1", ResourceVersion:"1743", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-qf5vj
I0712 08:48:18.402] replicationcontroller/redis-slave created
... skipping 8 lines ...
I0712 08:48:19.054] horizontalpodautoscaler.autoscaling/frontend autoscaled
I0712 08:48:19.140] core.sh:1246: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
I0712 08:48:19.224] horizontalpodautoscaler.autoscaling "frontend" deleted
I0712 08:48:19.312] horizontalpodautoscaler.autoscaling/frontend autoscaled
I0712 08:48:19.393] core.sh:1250: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
I0712 08:48:19.461] horizontalpodautoscaler.autoscaling "frontend" deleted
W0712 08:48:19.562] E0712 08:48:18.786393   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:19.562] E0712 08:48:18.872173   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:19.562] I0712 08:48:18.883008   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"frontend", UID:"9417e307-1449-4fd2-8003-6e2e0dbb6bd1", APIVersion:"v1", ResourceVersion:"1762", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-gf6rf
W0712 08:48:19.563] I0712 08:48:18.886826   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"frontend", UID:"9417e307-1449-4fd2-8003-6e2e0dbb6bd1", APIVersion:"v1", ResourceVersion:"1762", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-42qk2
W0712 08:48:19.563] I0712 08:48:18.889217   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921290-25107", Name:"frontend", UID:"9417e307-1449-4fd2-8003-6e2e0dbb6bd1", APIVersion:"v1", ResourceVersion:"1762", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-qjz74
W0712 08:48:19.563] E0712 08:48:18.962759   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:19.564] E0712 08:48:19.056600   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:19.564] Error: required flag(s) "max" not set
W0712 08:48:19.564] 
W0712 08:48:19.564] 
W0712 08:48:19.564] Examples:
W0712 08:48:19.564]   # Auto scale a deployment "foo", with the number of pods between 2 and 10, no target CPU utilization specified so a default autoscaling policy will be used:
W0712 08:48:19.564]   kubectl autoscale deployment foo --min=2 --max=10
W0712 08:48:19.564]   
... skipping 54 lines ...
I0712 08:48:19.779]           limits:
I0712 08:48:19.780]             cpu: 300m
I0712 08:48:19.780]           requests:
I0712 08:48:19.780]             cpu: 300m
I0712 08:48:19.780]       terminationGracePeriodSeconds: 0
I0712 08:48:19.780] status: {}
W0712 08:48:19.881] E0712 08:48:19.788168   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:19.881] Error from server (NotFound): deployments.apps "nginx-deployment-resources" not found
W0712 08:48:19.881] E0712 08:48:19.873990   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:19.965] E0712 08:48:19.964371   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:20.020] I0712 08:48:20.019809   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921290-25107", Name:"nginx-deployment-resources", UID:"f80e7d42-29c9-41be-a33f-d69a986d5c6f", APIVersion:"apps/v1", ResourceVersion:"1785", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6c4ccc4447 to 3
W0712 08:48:20.025] I0712 08:48:20.024344   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921290-25107", Name:"nginx-deployment-resources-6c4ccc4447", UID:"4906575c-284a-4e45-80f2-1b1a888812a0", APIVersion:"apps/v1", ResourceVersion:"1786", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6c4ccc4447-q5bmh
W0712 08:48:20.030] I0712 08:48:20.029493   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921290-25107", Name:"nginx-deployment-resources-6c4ccc4447", UID:"4906575c-284a-4e45-80f2-1b1a888812a0", APIVersion:"apps/v1", ResourceVersion:"1786", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6c4ccc4447-z5x4j
W0712 08:48:20.031] I0712 08:48:20.030785   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921290-25107", Name:"nginx-deployment-resources-6c4ccc4447", UID:"4906575c-284a-4e45-80f2-1b1a888812a0", APIVersion:"apps/v1", ResourceVersion:"1786", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6c4ccc4447-mmkf9
W0712 08:48:20.058] E0712 08:48:20.057596   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:48:20.158] deployment.apps/nginx-deployment-resources created
I0712 08:48:20.159] core.sh:1265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment-resources:
I0712 08:48:20.204] core.sh:1266: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0712 08:48:20.295] core.sh:1267: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
I0712 08:48:20.378] deployment.apps/nginx-deployment-resources resource requirements updated
W0712 08:48:20.479] I0712 08:48:20.384204   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921290-25107", Name:"nginx-deployment-resources", UID:"f80e7d42-29c9-41be-a33f-d69a986d5c6f", APIVersion:"apps/v1", ResourceVersion:"1799", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6489868555 to 1
W0712 08:48:20.479] I0712 08:48:20.388154   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921290-25107", Name:"nginx-deployment-resources-6489868555", UID:"5889bcd6-b5ce-4db4-a5d1-481346755aef", APIVersion:"apps/v1", ResourceVersion:"1800", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6489868555-jbrfm
I0712 08:48:20.580] core.sh:1270: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 100m:
I0712 08:48:20.593] core.sh:1271: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
I0712 08:48:20.772] deployment.apps/nginx-deployment-resources resource requirements updated
W0712 08:48:20.873] error: unable to find container named redis
W0712 08:48:20.874] E0712 08:48:20.789403   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:20.875] I0712 08:48:20.794389   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921290-25107", Name:"nginx-deployment-resources", UID:"f80e7d42-29c9-41be-a33f-d69a986d5c6f", APIVersion:"apps/v1", ResourceVersion:"1809", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-6489868555 to 0
W0712 08:48:20.875] I0712 08:48:20.811409   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921290-25107", Name:"nginx-deployment-resources-6489868555", UID:"5889bcd6-b5ce-4db4-a5d1-481346755aef", APIVersion:"apps/v1", ResourceVersion:"1813", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-6489868555-jbrfm
W0712 08:48:20.876] I0712 08:48:20.818719   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921290-25107", Name:"nginx-deployment-resources", UID:"f80e7d42-29c9-41be-a33f-d69a986d5c6f", APIVersion:"apps/v1", ResourceVersion:"1812", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-7f57ff9d5c to 1
W0712 08:48:20.877] I0712 08:48:20.826230   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921290-25107", Name:"nginx-deployment-resources-7f57ff9d5c", UID:"d60487e1-4bf0-468b-b651-2e4a4d0ee1b4", APIVersion:"apps/v1", ResourceVersion:"1819", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-7f57ff9d5c-ztqdn
W0712 08:48:20.877] E0712 08:48:20.875154   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:20.966] E0712 08:48:20.965675   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:21.061] E0712 08:48:21.060379   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:21.080] I0712 08:48:21.079170   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921290-25107", Name:"nginx-deployment-resources", UID:"f80e7d42-29c9-41be-a33f-d69a986d5c6f", APIVersion:"apps/v1", ResourceVersion:"1828", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-6c4ccc4447 to 2
W0712 08:48:21.086] I0712 08:48:21.085787   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921290-25107", Name:"nginx-deployment-resources-6c4ccc4447", UID:"4906575c-284a-4e45-80f2-1b1a888812a0", APIVersion:"apps/v1", ResourceVersion:"1833", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-6c4ccc4447-q5bmh
W0712 08:48:21.100] I0712 08:48:21.099254   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921290-25107", Name:"nginx-deployment-resources", UID:"f80e7d42-29c9-41be-a33f-d69a986d5c6f", APIVersion:"apps/v1", ResourceVersion:"1832", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-7f4b759546 to 1
W0712 08:48:21.107] I0712 08:48:21.106211   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921290-25107", Name:"nginx-deployment-resources-7f4b759546", UID:"d0d4d876-0572-411b-a0b0-8e6fa27b5038", APIVersion:"apps/v1", ResourceVersion:"1839", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-7f4b759546-grvdc
I0712 08:48:21.207] core.sh:1276: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
I0712 08:48:21.208] core.sh:1277: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
... skipping 196 lines ...
I0712 08:48:21.478]     status: "True"
I0712 08:48:21.478]     type: Progressing
I0712 08:48:21.478]   observedGeneration: 4
I0712 08:48:21.478]   replicas: 4
I0712 08:48:21.478]   unavailableReplicas: 4
I0712 08:48:21.478]   updatedReplicas: 1
W0712 08:48:21.579] error: you must specify resources by --filename when --local is set.
W0712 08:48:21.579] Example resource specifications include:
W0712 08:48:21.579]    '-f rsrc.yaml'
W0712 08:48:21.579]    '--filename=rsrc.json'
I0712 08:48:21.680] core.sh:1286: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
I0712 08:48:21.726] core.sh:1287: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
I0712 08:48:21.812] core.sh:1288: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
... skipping 23 lines ...
I0712 08:48:22.829] Successful
I0712 08:48:22.829] message:10
I0712 08:48:22.829] has:10
I0712 08:48:22.919] Successful
I0712 08:48:22.919] message:apps/v1
I0712 08:48:22.919] has:apps/v1
W0712 08:48:23.020] E0712 08:48:21.790628   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:23.021] E0712 08:48:21.876827   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:23.021] E0712 08:48:21.967014   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:23.021] E0712 08:48:22.061777   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:23.022] I0712 08:48:22.233615   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921302-5904", Name:"test-nginx-extensions", UID:"97152c0d-3409-4906-be76-347337073006", APIVersion:"apps/v1", ResourceVersion:"1865", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-nginx-extensions-785ccd54b9 to 1
W0712 08:48:23.022] I0712 08:48:22.239974   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"test-nginx-extensions-785ccd54b9", UID:"b626bb60-cf6c-4eff-bb2b-d7aa5128118e", APIVersion:"apps/v1", ResourceVersion:"1866", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-nginx-extensions-785ccd54b9-qnlg4
W0712 08:48:23.023] I0712 08:48:22.656310   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921302-5904", Name:"test-nginx-apps", UID:"13431f6d-226d-4cec-805e-261b3b0734ff", APIVersion:"apps/v1", ResourceVersion:"1876", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-nginx-apps-5d9fc459f to 1
W0712 08:48:23.023] I0712 08:48:22.660739   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"test-nginx-apps-5d9fc459f", UID:"b68cb539-2455-4300-8402-48885d1cba5d", APIVersion:"apps/v1", ResourceVersion:"1877", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-nginx-apps-5d9fc459f-5mlcq
W0712 08:48:23.024] E0712 08:48:22.794170   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:23.024] E0712 08:48:22.878161   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:23.024] E0712 08:48:22.968091   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:23.064] E0712 08:48:23.063412   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:48:23.165] Successful describe rs:
I0712 08:48:23.165] Name:           test-nginx-apps-5d9fc459f
I0712 08:48:23.166] Namespace:      namespace-1562921302-5904
I0712 08:48:23.166] Selector:       app=test-nginx-apps,pod-template-hash=5d9fc459f
I0712 08:48:23.166] Labels:         app=test-nginx-apps
I0712 08:48:23.166]                 pod-template-hash=5d9fc459f
I0712 08:48:23.167] Annotations:    deployment.kubernetes.io/desired-replicas: 1
I0712 08:48:23.167]                 deployment.kubernetes.io/max-replicas: 2
I0712 08:48:23.167]                 deployment.kubernetes.io/revision: 1
I0712 08:48:23.167] Controlled By:  Deployment/test-nginx-apps
I0712 08:48:23.168] Replicas:       1 current / 1 desired
I0712 08:48:23.168] Pods Status:    0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0712 08:48:23.168] Pod Template:
I0712 08:48:23.168]   Labels:  app=test-nginx-apps
I0712 08:48:23.169]            pod-template-hash=5d9fc459f
I0712 08:48:23.169]   Containers:
I0712 08:48:23.169]    nginx:
I0712 08:48:23.169]     Image:        k8s.gcr.io/nginx:test-cmd
... skipping 48 lines ...
I0712 08:48:24.653] apps.sh:247: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:48:24.732] deployment.apps/nginx-deployment created
I0712 08:48:24.833] apps.sh:251: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 1
I0712 08:48:24.914] deployment.apps "nginx-deployment" deleted
W0712 08:48:25.014] I0712 08:48:23.407110   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921302-5904", Name:"nginx-with-command", UID:"e03698c2-f8cd-4170-bfdd-f20fd8b12b72", APIVersion:"apps/v1", ResourceVersion:"1895", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-with-command-7c48b558b to 1
W0712 08:48:25.015] I0712 08:48:23.410428   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx-with-command-7c48b558b", UID:"2f2cecb2-23bb-46e9-af2b-f62f2ef51acb", APIVersion:"apps/v1", ResourceVersion:"1896", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-with-command-7c48b558b-qjt4n
W0712 08:48:25.015] E0712 08:48:23.795846   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:25.016] I0712 08:48:23.795959   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921302-5904", Name:"deployment-with-unixuserid", UID:"52521e22-091b-44c4-a0ba-feb6c7951afc", APIVersion:"apps/v1", ResourceVersion:"1909", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set deployment-with-unixuserid-54bb96774f to 1
W0712 08:48:25.016] I0712 08:48:23.800222   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"deployment-with-unixuserid-54bb96774f", UID:"88dc2fab-60ef-45a5-bce9-8a4f541feaed", APIVersion:"apps/v1", ResourceVersion:"1910", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deployment-with-unixuserid-54bb96774f-4wwbc
W0712 08:48:25.016] E0712 08:48:23.879300   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:25.016] E0712 08:48:23.969936   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:25.017] E0712 08:48:24.064782   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:25.017] I0712 08:48:24.211466   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment", UID:"a99c846e-6fa4-4d17-ab95-00afdc308dd4", APIVersion:"apps/v1", ResourceVersion:"1923", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-59f44f48d4 to 3
W0712 08:48:25.017] I0712 08:48:24.223111   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment-59f44f48d4", UID:"bd532507-a898-4607-8ce8-f1c788ab758d", APIVersion:"apps/v1", ResourceVersion:"1924", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-59f44f48d4-78kbf
W0712 08:48:25.017] I0712 08:48:24.228598   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment-59f44f48d4", UID:"bd532507-a898-4607-8ce8-f1c788ab758d", APIVersion:"apps/v1", ResourceVersion:"1924", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-59f44f48d4-cshmd
W0712 08:48:25.018] I0712 08:48:24.236463   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment-59f44f48d4", UID:"bd532507-a898-4607-8ce8-f1c788ab758d", APIVersion:"apps/v1", ResourceVersion:"1924", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-59f44f48d4-j6mzb
W0712 08:48:25.018] I0712 08:48:24.737160   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment", UID:"797cc06a-0d8a-41cf-97f8-a45fadbb8f06", APIVersion:"apps/v1", ResourceVersion:"1945", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-7549954bdc to 1
W0712 08:48:25.018] I0712 08:48:24.739573   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment-7549954bdc", UID:"8769ab28-1f22-4930-962e-c5fafce6d4be", APIVersion:"apps/v1", ResourceVersion:"1946", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-7549954bdc-ck4dr
W0712 08:48:25.019] E0712 08:48:24.797410   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:25.019] E0712 08:48:24.880822   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:25.019] E0712 08:48:24.971449   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:25.066] E0712 08:48:25.066065   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:48:25.167] apps.sh:256: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:48:25.167] apps.sh:257: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 1
I0712 08:48:25.284] replicaset.apps "nginx-deployment-7549954bdc" deleted
I0712 08:48:25.373] apps.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:48:25.528] deployment.apps/nginx-deployment created
W0712 08:48:25.629] I0712 08:48:25.532369   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment", UID:"477bb51f-6236-46ce-bace-59b25c902d2b", APIVersion:"apps/v1", ResourceVersion:"1964", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-59f44f48d4 to 3
... skipping 4 lines ...
I0712 08:48:25.732] horizontalpodautoscaler.autoscaling/nginx-deployment autoscaled
I0712 08:48:25.804] apps.sh:271: Successful get hpa nginx-deployment {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
I0712 08:48:25.885] horizontalpodautoscaler.autoscaling "nginx-deployment" deleted
I0712 08:48:25.956] deployment.apps "nginx-deployment" deleted
I0712 08:48:26.054] apps.sh:279: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:48:26.215] deployment.apps/nginx created
W0712 08:48:26.316] E0712 08:48:25.798370   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:26.316] E0712 08:48:25.881764   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:26.316] E0712 08:48:25.972369   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:26.316] E0712 08:48:26.066918   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:26.317] I0712 08:48:26.220671   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921302-5904", Name:"nginx", UID:"2a21dc44-f400-4515-9b63-d715701a28b1", APIVersion:"apps/v1", ResourceVersion:"1988", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-7bb858d887 to 3
W0712 08:48:26.317] I0712 08:48:26.223278   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx-7bb858d887", UID:"c8abe4dc-fa2e-4363-80e3-38015417c708", APIVersion:"apps/v1", ResourceVersion:"1989", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-7bb858d887-4qn84
W0712 08:48:26.317] I0712 08:48:26.226988   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx-7bb858d887", UID:"c8abe4dc-fa2e-4363-80e3-38015417c708", APIVersion:"apps/v1", ResourceVersion:"1989", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-7bb858d887-qwvbw
W0712 08:48:26.318] I0712 08:48:26.228237   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx-7bb858d887", UID:"c8abe4dc-fa2e-4363-80e3-38015417c708", APIVersion:"apps/v1", ResourceVersion:"1989", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-7bb858d887-rcfjb
I0712 08:48:26.418] apps.sh:283: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx:
I0712 08:48:26.419] apps.sh:284: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
... skipping 4 lines ...
I0712 08:48:26.939]     Image:	k8s.gcr.io/nginx:test-cmd
I0712 08:48:27.031] apps.sh:293: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
I0712 08:48:27.132] deployment.apps/nginx rolled back
W0712 08:48:27.232] Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
W0712 08:48:27.233] I0712 08:48:26.752640   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921302-5904", Name:"nginx", UID:"2a21dc44-f400-4515-9b63-d715701a28b1", APIVersion:"apps/v1", ResourceVersion:"2002", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-65d8657685 to 1
W0712 08:48:27.233] I0712 08:48:26.758204   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx-65d8657685", UID:"9e663605-adff-458a-958b-7fad1082d18c", APIVersion:"apps/v1", ResourceVersion:"2003", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-65d8657685-h6ff8
W0712 08:48:27.234] E0712 08:48:26.800991   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:27.234] E0712 08:48:26.883153   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:27.234] E0712 08:48:26.973867   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:27.234] E0712 08:48:27.068660   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:27.803] E0712 08:48:27.802568   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:27.885] E0712 08:48:27.884773   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:27.976] E0712 08:48:27.975443   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:28.070] E0712 08:48:28.070026   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:48:28.223] apps.sh:297: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0712 08:48:28.410] apps.sh:300: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0712 08:48:28.517] deployment.apps/nginx rolled back
W0712 08:48:28.617] error: unable to find specified revision 1000000 in history
W0712 08:48:28.804] E0712 08:48:28.803943   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:28.887] E0712 08:48:28.886301   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:28.977] E0712 08:48:28.976769   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:29.072] E0712 08:48:29.071586   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:48:29.604] apps.sh:304: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
I0712 08:48:29.689] deployment.apps/nginx paused
W0712 08:48:29.789] error: you cannot rollback a paused deployment; resume it first with 'kubectl rollout resume deployment/nginx' and try again
W0712 08:48:29.805] E0712 08:48:29.805382   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:29.877] error: deployments.apps "nginx" can't restart paused deployment (run rollout resume first)
W0712 08:48:29.888] E0712 08:48:29.887534   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:29.978] E0712 08:48:29.978095   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:30.073] E0712 08:48:30.072752   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:48:30.174] deployment.apps/nginx resumed
I0712 08:48:30.174] deployment.apps/nginx rolled back
I0712 08:48:30.236]     deployment.kubernetes.io/revision-history: 1,3
W0712 08:48:30.417] error: desired revision (3) is different from the running revision (5)
I0712 08:48:30.518] deployment.apps/nginx restarted
W0712 08:48:30.619] I0712 08:48:30.525639   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921302-5904", Name:"nginx", UID:"2a21dc44-f400-4515-9b63-d715701a28b1", APIVersion:"apps/v1", ResourceVersion:"2032", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-65d8657685 to 0
W0712 08:48:30.619] I0712 08:48:30.539444   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx-65d8657685", UID:"9e663605-adff-458a-958b-7fad1082d18c", APIVersion:"apps/v1", ResourceVersion:"2036", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-65d8657685-h6ff8
W0712 08:48:30.620] I0712 08:48:30.543868   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921302-5904", Name:"nginx", UID:"2a21dc44-f400-4515-9b63-d715701a28b1", APIVersion:"apps/v1", ResourceVersion:"2035", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-75fdff5585 to 1
W0712 08:48:30.620] I0712 08:48:30.549838   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx-75fdff5585", UID:"a16110bb-4e1e-4829-b9d5-fcdcf4e71716", APIVersion:"apps/v1", ResourceVersion:"2041", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-75fdff5585-2j6cr
W0712 08:48:30.807] E0712 08:48:30.806868   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:30.889] E0712 08:48:30.889105   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:30.980] E0712 08:48:30.979663   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:31.075] E0712 08:48:31.074186   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:48:31.694] Successful
I0712 08:48:31.694] message:apiVersion: apps/v1
I0712 08:48:31.694] kind: ReplicaSet
I0712 08:48:31.694] metadata:
I0712 08:48:31.694]   annotations:
I0712 08:48:31.694]     deployment.kubernetes.io/desired-replicas: "3"
... skipping 115 lines ...
I0712 08:48:31.710]       terminationGracePeriodSeconds: 30
I0712 08:48:31.710] status:
I0712 08:48:31.710]   fullyLabeledReplicas: 1
I0712 08:48:31.710]   observedGeneration: 2
I0712 08:48:31.710]   replicas: 1
I0712 08:48:31.710] has:deployment.kubernetes.io/revision: "6"
W0712 08:48:31.811] E0712 08:48:31.808099   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:31.859] I0712 08:48:31.858535   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921302-5904", Name:"nginx2", UID:"81f1b863-a0c3-4dc8-8332-e609e5e23381", APIVersion:"apps/v1", ResourceVersion:"2053", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx2-685576569f to 3
W0712 08:48:31.862] I0712 08:48:31.861705   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx2-685576569f", UID:"4c56eccf-598e-4617-a841-cd62c57c44aa", APIVersion:"apps/v1", ResourceVersion:"2054", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-685576569f-dk6m2
W0712 08:48:31.867] I0712 08:48:31.866939   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx2-685576569f", UID:"4c56eccf-598e-4617-a841-cd62c57c44aa", APIVersion:"apps/v1", ResourceVersion:"2054", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-685576569f-njpvt
W0712 08:48:31.868] I0712 08:48:31.867208   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx2-685576569f", UID:"4c56eccf-598e-4617-a841-cd62c57c44aa", APIVersion:"apps/v1", ResourceVersion:"2054", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-685576569f-zmst6
W0712 08:48:31.891] E0712 08:48:31.890326   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:31.981] E0712 08:48:31.980914   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:32.076] E0712 08:48:32.075478   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:48:32.176] deployment.apps/nginx2 created
I0712 08:48:32.177] deployment.apps "nginx2" deleted
I0712 08:48:32.177] deployment.apps "nginx" deleted
I0712 08:48:32.177] apps.sh:334: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:48:32.256] deployment.apps/nginx-deployment created
W0712 08:48:32.356] I0712 08:48:32.261183   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment", UID:"36839ae3-931d-4125-9cda-257f9f19d9d3", APIVersion:"apps/v1", ResourceVersion:"2087", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-68c65c466b to 3
... skipping 14 lines ...
I0712 08:48:33.459] apps.sh:353: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
I0712 08:48:33.608] apps.sh:356: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
I0712 08:48:33.685] apps.sh:357: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
I0712 08:48:33.777] deployment.apps/nginx-deployment image updated
W0712 08:48:33.878] I0712 08:48:32.607473   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment", UID:"36839ae3-931d-4125-9cda-257f9f19d9d3", APIVersion:"apps/v1", ResourceVersion:"2101", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-797bfd8654 to 1
W0712 08:48:33.878] I0712 08:48:32.610533   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment-797bfd8654", UID:"85faac52-4de6-4f4e-b59b-9f2420ed3fe5", APIVersion:"apps/v1", ResourceVersion:"2102", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-797bfd8654-249tn
W0712 08:48:33.878] E0712 08:48:32.809522   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:33.879] E0712 08:48:32.891755   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:33.879] error: unable to find container named "redis"
W0712 08:48:33.879] E0712 08:48:32.982194   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:33.879] E0712 08:48:33.076871   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:33.879] I0712 08:48:33.804292   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment", UID:"36839ae3-931d-4125-9cda-257f9f19d9d3", APIVersion:"apps/v1", ResourceVersion:"2121", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-68c65c466b to 2
W0712 08:48:33.880] E0712 08:48:33.816822   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:33.880] I0712 08:48:33.818221   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment-68c65c466b", UID:"3bdcf269-b06b-47ad-8789-f5adfe2a2d9a", APIVersion:"apps/v1", ResourceVersion:"2125", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-68c65c466b-cblrm
W0712 08:48:33.880] I0712 08:48:33.835050   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment", UID:"36839ae3-931d-4125-9cda-257f9f19d9d3", APIVersion:"apps/v1", ResourceVersion:"2124", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-8555bfdd to 1
W0712 08:48:33.881] I0712 08:48:33.841660   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment-8555bfdd", UID:"1c0cd716-dcf1-4c7b-932e-0928f9594433", APIVersion:"apps/v1", ResourceVersion:"2131", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-8555bfdd-672hg
W0712 08:48:33.893] E0712 08:48:33.892641   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:33.983] E0712 08:48:33.983331   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:34.054] I0712 08:48:34.054339   51895 horizontal.go:341] Horizontal Pod Autoscaler frontend has been deleted in namespace-1562921290-25107
W0712 08:48:34.078] E0712 08:48:34.078035   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:48:34.179] apps.sh:360: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0712 08:48:34.179] apps.sh:361: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0712 08:48:34.180] apps.sh:364: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0712 08:48:34.224] apps.sh:365: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0712 08:48:34.308] deployment.apps "nginx-deployment" deleted
I0712 08:48:34.396] apps.sh:371: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 11 lines ...
I0712 08:48:35.227] apps.sh:383: Successful get deploy nginx-deployment {{ (index (index .spec.template.spec.containers 0).env 0).name}}: KEY_2
I0712 08:48:35.304] apps.sh:385: Successful get deploy nginx-deployment {{ len (index .spec.template.spec.containers 0).env }}: 1
I0712 08:48:35.398] deployment.apps/nginx-deployment env updated
I0712 08:48:35.489] apps.sh:389: Successful get deploy nginx-deployment {{ len (index .spec.template.spec.containers 0).env }}: 2
I0712 08:48:35.577] deployment.apps/nginx-deployment env updated
I0712 08:48:35.675] deployment.apps/nginx-deployment env updated
W0712 08:48:35.776] E0712 08:48:34.818657   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:35.776] E0712 08:48:34.893921   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:35.777] E0712 08:48:34.984867   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:35.777] E0712 08:48:35.079379   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:35.777] I0712 08:48:35.150942   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment", UID:"95772185-5a32-4a1c-9311-9c2b66e14ee3", APIVersion:"apps/v1", ResourceVersion:"2170", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-7cc69b4cf4 to 1
W0712 08:48:35.778] I0712 08:48:35.154862   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment-7cc69b4cf4", UID:"22314435-8794-4021-b13a-78911807a5a8", APIVersion:"apps/v1", ResourceVersion:"2171", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-7cc69b4cf4-9sxsm
W0712 08:48:35.778] I0712 08:48:35.424707   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment", UID:"95772185-5a32-4a1c-9311-9c2b66e14ee3", APIVersion:"apps/v1", ResourceVersion:"2180", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-68c65c466b to 2
W0712 08:48:35.778] I0712 08:48:35.431041   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment-68c65c466b", UID:"1faa28f2-3cec-4ad2-b3fb-e7e09881bc67", APIVersion:"apps/v1", ResourceVersion:"2184", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-68c65c466b-6x2kk
W0712 08:48:35.779] I0712 08:48:35.449137   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment", UID:"95772185-5a32-4a1c-9311-9c2b66e14ee3", APIVersion:"apps/v1", ResourceVersion:"2183", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6f4fd5846d to 1
W0712 08:48:35.779] I0712 08:48:35.455918   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment-6f4fd5846d", UID:"b06d120e-84f7-436f-9c4a-cc184b865dbd", APIVersion:"apps/v1", ResourceVersion:"2191", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6f4fd5846d-npjqk
... skipping 2 lines ...
W0712 08:48:35.780] I0712 08:48:35.620073   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment", UID:"95772185-5a32-4a1c-9311-9c2b66e14ee3", APIVersion:"apps/v1", ResourceVersion:"2203", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-ffc5b56c4 to 1
W0712 08:48:35.780] I0712 08:48:35.625593   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment-ffc5b56c4", UID:"d30a6d3c-a912-480c-a9dd-434ba32cc422", APIVersion:"apps/v1", ResourceVersion:"2211", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-ffc5b56c4-6qbsc
W0712 08:48:35.781] I0712 08:48:35.700807   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment", UID:"95772185-5a32-4a1c-9311-9c2b66e14ee3", APIVersion:"apps/v1", ResourceVersion:"2220", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-68c65c466b to 0
W0712 08:48:35.781] I0712 08:48:35.707235   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment-68c65c466b", UID:"1faa28f2-3cec-4ad2-b3fb-e7e09881bc67", APIVersion:"apps/v1", ResourceVersion:"2224", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-68c65c466b-nznsj
W0712 08:48:35.781] I0712 08:48:35.723656   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment", UID:"95772185-5a32-4a1c-9311-9c2b66e14ee3", APIVersion:"apps/v1", ResourceVersion:"2223", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-55d4d6d4bf to 1
W0712 08:48:35.782] I0712 08:48:35.729608   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment-55d4d6d4bf", UID:"55553d4d-08bb-4d8d-9173-a8b6574427ab", APIVersion:"apps/v1", ResourceVersion:"2230", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-55d4d6d4bf-xtxrq
W0712 08:48:35.820] E0712 08:48:35.819795   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:35.895] E0712 08:48:35.895175   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:35.954] I0712 08:48:35.953677   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment", UID:"95772185-5a32-4a1c-9311-9c2b66e14ee3", APIVersion:"apps/v1", ResourceVersion:"2240", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-55d4d6d4bf to 0
W0712 08:48:35.963] I0712 08:48:35.962721   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment-55d4d6d4bf", UID:"55553d4d-08bb-4d8d-9173-a8b6574427ab", APIVersion:"apps/v1", ResourceVersion:"2245", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-55d4d6d4bf-xtxrq
W0712 08:48:35.986] E0712 08:48:35.986080   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:36.081] E0712 08:48:36.080808   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:36.104] I0712 08:48:36.103948   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment", UID:"95772185-5a32-4a1c-9311-9c2b66e14ee3", APIVersion:"apps/v1", ResourceVersion:"2252", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-54c9b86fff to 1
W0712 08:48:36.110] I0712 08:48:36.110109   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921302-5904", Name:"nginx-deployment-54c9b86fff", UID:"042d5375-d058-4b1c-b793-1ed2dd8b8602", APIVersion:"apps/v1", ResourceVersion:"2253", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-54c9b86fff-cfwrb
W0712 08:48:36.142] E0712 08:48:36.141371   51895 replica_set.go:450] Sync "namespace-1562921302-5904/nginx-deployment-54c9b86fff" failed with Operation cannot be fulfilled on replicasets.apps "nginx-deployment-54c9b86fff": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1562921302-5904/nginx-deployment-54c9b86fff, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 042d5375-d058-4b1c-b793-1ed2dd8b8602, UID in object meta: 
I0712 08:48:36.243] deployment.apps/nginx-deployment env updated
I0712 08:48:36.243] deployment.apps/nginx-deployment env updated
I0712 08:48:36.244] deployment.apps/nginx-deployment env updated
I0712 08:48:36.244] deployment.apps "nginx-deployment" deleted
I0712 08:48:36.244] configmap "test-set-env-config" deleted
I0712 08:48:36.280] secret "test-set-env-secret" deleted
... skipping 15 lines ...
I0712 08:48:36.960] apps.sh:517: Successful get pods -l "tier=frontend" {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:48:37.053] apps.sh:521: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:48:37.214] replicaset.apps/frontend-no-cascade created
W0712 08:48:37.315] I0712 08:48:36.771643   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"frontend", UID:"dab91f2b-8adc-49c6-b8fd-fec1754ad922", APIVersion:"apps/v1", ResourceVersion:"2278", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-6gtc7
W0712 08:48:37.316] I0712 08:48:36.775950   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"frontend", UID:"dab91f2b-8adc-49c6-b8fd-fec1754ad922", APIVersion:"apps/v1", ResourceVersion:"2278", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-qfnbd
W0712 08:48:37.316] I0712 08:48:36.776901   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"frontend", UID:"dab91f2b-8adc-49c6-b8fd-fec1754ad922", APIVersion:"apps/v1", ResourceVersion:"2278", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-z2n5g
W0712 08:48:37.316] E0712 08:48:36.822065   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:37.316] E0712 08:48:36.896623   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:37.317] E0712 08:48:36.987832   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:37.317] E0712 08:48:37.082330   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:37.317] I0712 08:48:37.221386   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"frontend-no-cascade", UID:"b4730e06-47c8-4fdb-918a-312ea36170f6", APIVersion:"apps/v1", ResourceVersion:"2295", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-f5p9r
W0712 08:48:37.317] I0712 08:48:37.227426   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"frontend-no-cascade", UID:"b4730e06-47c8-4fdb-918a-312ea36170f6", APIVersion:"apps/v1", ResourceVersion:"2295", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-jqrmv
W0712 08:48:37.318] I0712 08:48:37.227811   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"frontend-no-cascade", UID:"b4730e06-47c8-4fdb-918a-312ea36170f6", APIVersion:"apps/v1", ResourceVersion:"2295", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-sr4lx
I0712 08:48:37.418] apps.sh:527: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
I0712 08:48:37.419] +++ [0712 08:48:37] Deleting rs
I0712 08:48:37.419] replicaset.apps "frontend-no-cascade" deleted
... skipping 11 lines ...
I0712 08:48:38.307] Namespace:    namespace-1562921316-349
I0712 08:48:38.307] Selector:     app=guestbook,tier=frontend
I0712 08:48:38.307] Labels:       app=guestbook
I0712 08:48:38.307]               tier=frontend
I0712 08:48:38.308] Annotations:  <none>
I0712 08:48:38.308] Replicas:     3 current / 3 desired
I0712 08:48:38.308] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0712 08:48:38.308] Pod Template:
I0712 08:48:38.308]   Labels:  app=guestbook
I0712 08:48:38.308]            tier=frontend
I0712 08:48:38.308]   Containers:
I0712 08:48:38.308]    php-redis:
I0712 08:48:38.308]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
I0712 08:48:38.424] Namespace:    namespace-1562921316-349
I0712 08:48:38.424] Selector:     app=guestbook,tier=frontend
I0712 08:48:38.424] Labels:       app=guestbook
I0712 08:48:38.424]               tier=frontend
I0712 08:48:38.424] Annotations:  <none>
I0712 08:48:38.425] Replicas:     3 current / 3 desired
I0712 08:48:38.425] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0712 08:48:38.425] Pod Template:
I0712 08:48:38.425]   Labels:  app=guestbook
I0712 08:48:38.425]            tier=frontend
I0712 08:48:38.425]   Containers:
I0712 08:48:38.425]    php-redis:
I0712 08:48:38.425]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 10 lines ...
I0712 08:48:38.426]   Type    Reason            Age   From                   Message
I0712 08:48:38.426]   ----    ------            ----  ----                   -------
I0712 08:48:38.426]   Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-sc2bn
I0712 08:48:38.427]   Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-wf5gq
I0712 08:48:38.427]   Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-xm865
I0712 08:48:38.427] 
W0712 08:48:38.527] E0712 08:48:37.823526   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:38.528] E0712 08:48:37.898167   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:38.528] E0712 08:48:37.989208   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:38.528] I0712 08:48:38.077056   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"frontend", UID:"19697ed6-2e16-46c6-a3ff-14135269943b", APIVersion:"apps/v1", ResourceVersion:"2318", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-sc2bn
W0712 08:48:38.529] I0712 08:48:38.083537   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"frontend", UID:"19697ed6-2e16-46c6-a3ff-14135269943b", APIVersion:"apps/v1", ResourceVersion:"2318", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-wf5gq
W0712 08:48:38.529] E0712 08:48:38.083744   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:38.529] I0712 08:48:38.084288   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"frontend", UID:"19697ed6-2e16-46c6-a3ff-14135269943b", APIVersion:"apps/v1", ResourceVersion:"2318", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-xm865
I0712 08:48:38.630] apps.sh:550: Successful describe
I0712 08:48:38.630] Name:         frontend
I0712 08:48:38.630] Namespace:    namespace-1562921316-349
I0712 08:48:38.630] Selector:     app=guestbook,tier=frontend
I0712 08:48:38.630] Labels:       app=guestbook
I0712 08:48:38.630]               tier=frontend
I0712 08:48:38.630] Annotations:  <none>
I0712 08:48:38.630] Replicas:     3 current / 3 desired
I0712 08:48:38.631] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0712 08:48:38.631] Pod Template:
I0712 08:48:38.631]   Labels:  app=guestbook
I0712 08:48:38.631]            tier=frontend
I0712 08:48:38.631]   Containers:
I0712 08:48:38.631]    php-redis:
I0712 08:48:38.631]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 12 lines ...
I0712 08:48:38.648] Namespace:    namespace-1562921316-349
I0712 08:48:38.648] Selector:     app=guestbook,tier=frontend
I0712 08:48:38.648] Labels:       app=guestbook
I0712 08:48:38.648]               tier=frontend
I0712 08:48:38.648] Annotations:  <none>
I0712 08:48:38.649] Replicas:     3 current / 3 desired
I0712 08:48:38.649] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0712 08:48:38.649] Pod Template:
I0712 08:48:38.649]   Labels:  app=guestbook
I0712 08:48:38.649]            tier=frontend
I0712 08:48:38.649]   Containers:
I0712 08:48:38.649]    php-redis:
I0712 08:48:38.650]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 18 lines ...
I0712 08:48:38.787] Namespace:    namespace-1562921316-349
I0712 08:48:38.787] Selector:     app=guestbook,tier=frontend
I0712 08:48:38.787] Labels:       app=guestbook
I0712 08:48:38.787]               tier=frontend
I0712 08:48:38.787] Annotations:  <none>
I0712 08:48:38.787] Replicas:     3 current / 3 desired
I0712 08:48:38.788] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0712 08:48:38.788] Pod Template:
I0712 08:48:38.788]   Labels:  app=guestbook
I0712 08:48:38.788]            tier=frontend
I0712 08:48:38.788]   Containers:
I0712 08:48:38.788]    php-redis:
I0712 08:48:38.788]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
I0712 08:48:38.899] Namespace:    namespace-1562921316-349
I0712 08:48:38.900] Selector:     app=guestbook,tier=frontend
I0712 08:48:38.900] Labels:       app=guestbook
I0712 08:48:38.900]               tier=frontend
I0712 08:48:38.900] Annotations:  <none>
I0712 08:48:38.900] Replicas:     3 current / 3 desired
I0712 08:48:38.900] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0712 08:48:38.900] Pod Template:
I0712 08:48:38.900]   Labels:  app=guestbook
I0712 08:48:38.901]            tier=frontend
I0712 08:48:38.901]   Containers:
I0712 08:48:38.901]    php-redis:
I0712 08:48:38.901]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
I0712 08:48:39.007] Namespace:    namespace-1562921316-349
I0712 08:48:39.008] Selector:     app=guestbook,tier=frontend
I0712 08:48:39.008] Labels:       app=guestbook
I0712 08:48:39.008]               tier=frontend
I0712 08:48:39.008] Annotations:  <none>
I0712 08:48:39.008] Replicas:     3 current / 3 desired
I0712 08:48:39.008] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0712 08:48:39.008] Pod Template:
I0712 08:48:39.008]   Labels:  app=guestbook
I0712 08:48:39.008]            tier=frontend
I0712 08:48:39.008]   Containers:
I0712 08:48:39.008]    php-redis:
I0712 08:48:39.009]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 11 lines ...
I0712 08:48:39.120] Namespace:    namespace-1562921316-349
I0712 08:48:39.121] Selector:     app=guestbook,tier=frontend
I0712 08:48:39.121] Labels:       app=guestbook
I0712 08:48:39.121]               tier=frontend
I0712 08:48:39.121] Annotations:  <none>
I0712 08:48:39.121] Replicas:     3 current / 3 desired
I0712 08:48:39.121] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0712 08:48:39.121] Pod Template:
I0712 08:48:39.121]   Labels:  app=guestbook
I0712 08:48:39.121]            tier=frontend
I0712 08:48:39.121]   Containers:
I0712 08:48:39.121]    php-redis:
I0712 08:48:39.122]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 99 lines ...
I0712 08:48:39.269] Tolerations:           <none>
I0712 08:48:39.269] Events:                <none>
I0712 08:48:39.353] apps.sh:566: Successful get rs frontend {{.spec.replicas}}: 3
I0712 08:48:39.450] replicaset.apps/frontend scaled
I0712 08:48:39.554] apps.sh:570: Successful get rs frontend {{.spec.replicas}}: 2
I0712 08:48:39.701] deployment.apps/scale-1 created
W0712 08:48:39.802] E0712 08:48:38.825201   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:39.802] E0712 08:48:38.900101   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:39.802] E0712 08:48:38.990698   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:39.803] E0712 08:48:39.085165   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:39.803] I0712 08:48:39.458550   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"frontend", UID:"19697ed6-2e16-46c6-a3ff-14135269943b", APIVersion:"apps/v1", ResourceVersion:"2328", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-sc2bn
W0712 08:48:39.803] I0712 08:48:39.706948   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921316-349", Name:"scale-1", UID:"d5c70243-85d8-4993-8f97-4b45bc297d42", APIVersion:"apps/v1", ResourceVersion:"2334", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-7bc75f7887 to 1
W0712 08:48:39.803] I0712 08:48:39.711676   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"scale-1-7bc75f7887", UID:"8f783d1f-9f2e-4ed0-8fbb-0bfac9831bdd", APIVersion:"apps/v1", ResourceVersion:"2335", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-7bc75f7887-hmw9p
W0712 08:48:39.827] E0712 08:48:39.826728   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:39.881] I0712 08:48:39.880498   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921316-349", Name:"scale-2", UID:"7b631721-53d4-47bf-9fc8-16c4ef7f3e49", APIVersion:"apps/v1", ResourceVersion:"2344", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-7bc75f7887 to 1
W0712 08:48:39.885] I0712 08:48:39.885188   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"scale-2-7bc75f7887", UID:"9ae0d730-d9cc-4678-a0cb-17e6b7274fec", APIVersion:"apps/v1", ResourceVersion:"2345", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-7bc75f7887-5kxq5
W0712 08:48:39.901] E0712 08:48:39.901075   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:39.992] E0712 08:48:39.991973   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:40.044] I0712 08:48:40.044072   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921316-349", Name:"scale-3", UID:"663f3dca-1fb5-42c8-9465-909cda655ba6", APIVersion:"apps/v1", ResourceVersion:"2354", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-3-7bc75f7887 to 1
W0712 08:48:40.050] I0712 08:48:40.049875   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"scale-3-7bc75f7887", UID:"3847419f-128c-426b-9d9e-aa008e64d276", APIVersion:"apps/v1", ResourceVersion:"2355", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-7bc75f7887-75zn6
W0712 08:48:40.087] E0712 08:48:40.086551   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:48:40.187] deployment.apps/scale-2 created
I0712 08:48:40.188] deployment.apps/scale-3 created
I0712 08:48:40.188] apps.sh:576: Successful get deploy scale-1 {{.spec.replicas}}: 1
I0712 08:48:40.235] apps.sh:577: Successful get deploy scale-2 {{.spec.replicas}}: 1
I0712 08:48:40.329] apps.sh:578: Successful get deploy scale-3 {{.spec.replicas}}: 1
I0712 08:48:40.423] deployment.apps/scale-1 scaled
... skipping 8 lines ...
W0712 08:48:40.934] I0712 08:48:40.435815   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"scale-1-7bc75f7887", UID:"8f783d1f-9f2e-4ed0-8fbb-0bfac9831bdd", APIVersion:"apps/v1", ResourceVersion:"2365", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-7bc75f7887-kqh49
W0712 08:48:40.934] I0712 08:48:40.445225   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921316-349", Name:"scale-2", UID:"7b631721-53d4-47bf-9fc8-16c4ef7f3e49", APIVersion:"apps/v1", ResourceVersion:"2366", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-7bc75f7887 to 2
W0712 08:48:40.935] I0712 08:48:40.447789   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"scale-2-7bc75f7887", UID:"9ae0d730-d9cc-4678-a0cb-17e6b7274fec", APIVersion:"apps/v1", ResourceVersion:"2372", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-7bc75f7887-j8bbg
W0712 08:48:40.935] I0712 08:48:40.717360   51895 horizontal.go:341] Horizontal Pod Autoscaler nginx-deployment has been deleted in namespace-1562921302-5904
W0712 08:48:40.935] I0712 08:48:40.816753   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921316-349", Name:"scale-1", UID:"d5c70243-85d8-4993-8f97-4b45bc297d42", APIVersion:"apps/v1", ResourceVersion:"2384", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-7bc75f7887 to 3
W0712 08:48:40.936] I0712 08:48:40.824909   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"scale-1-7bc75f7887", UID:"8f783d1f-9f2e-4ed0-8fbb-0bfac9831bdd", APIVersion:"apps/v1", ResourceVersion:"2385", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-7bc75f7887-cj4fr
W0712 08:48:40.936] E0712 08:48:40.828679   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:40.936] I0712 08:48:40.839839   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921316-349", Name:"scale-2", UID:"7b631721-53d4-47bf-9fc8-16c4ef7f3e49", APIVersion:"apps/v1", ResourceVersion:"2386", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-7bc75f7887 to 3
W0712 08:48:40.936] I0712 08:48:40.852913   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"scale-2-7bc75f7887", UID:"9ae0d730-d9cc-4678-a0cb-17e6b7274fec", APIVersion:"apps/v1", ResourceVersion:"2393", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-7bc75f7887-7b7m6
W0712 08:48:40.937] I0712 08:48:40.855415   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921316-349", Name:"scale-3", UID:"663f3dca-1fb5-42c8-9465-909cda655ba6", APIVersion:"apps/v1", ResourceVersion:"2392", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-3-7bc75f7887 to 3
W0712 08:48:40.937] I0712 08:48:40.876571   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"scale-3-7bc75f7887", UID:"3847419f-128c-426b-9d9e-aa008e64d276", APIVersion:"apps/v1", ResourceVersion:"2397", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-7bc75f7887-mlsz9
W0712 08:48:40.937] I0712 08:48:40.888387   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"scale-3-7bc75f7887", UID:"3847419f-128c-426b-9d9e-aa008e64d276", APIVersion:"apps/v1", ResourceVersion:"2397", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-7bc75f7887-89jbk
W0712 08:48:40.938] E0712 08:48:40.902602   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:40.994] E0712 08:48:40.993510   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:41.088] E0712 08:48:41.087994   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:48:41.189] apps.sh:586: Successful get deploy scale-1 {{.spec.replicas}}: 3
I0712 08:48:41.189] apps.sh:587: Successful get deploy scale-2 {{.spec.replicas}}: 3
I0712 08:48:41.189] apps.sh:588: Successful get deploy scale-3 {{.spec.replicas}}: 3
I0712 08:48:41.251] replicaset.apps "frontend" deleted
I0712 08:48:41.331] deployment.apps "scale-1" deleted
I0712 08:48:41.335] deployment.apps "scale-2" deleted
... skipping 18 lines ...
I0712 08:48:42.914] apps.sh:624: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:48:43.005] apps.sh:628: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:48:43.162] replicaset.apps/frontend created
W0712 08:48:43.263] I0712 08:48:41.515508   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"frontend", UID:"9929015d-a32d-4783-88ff-46756497e9ae", APIVersion:"apps/v1", ResourceVersion:"2446", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-85bjh
W0712 08:48:43.263] I0712 08:48:41.519677   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"frontend", UID:"9929015d-a32d-4783-88ff-46756497e9ae", APIVersion:"apps/v1", ResourceVersion:"2446", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-9bfwd
W0712 08:48:43.263] I0712 08:48:41.520059   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"frontend", UID:"9929015d-a32d-4783-88ff-46756497e9ae", APIVersion:"apps/v1", ResourceVersion:"2446", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-zzsnm
W0712 08:48:43.264] E0712 08:48:41.830365   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:43.264] E0712 08:48:41.904372   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:43.264] E0712 08:48:41.995081   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:43.264] E0712 08:48:42.089202   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:43.265] E0712 08:48:42.834466   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:43.265] E0712 08:48:42.905964   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:43.265] E0712 08:48:42.996123   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:43.265] E0712 08:48:43.091025   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:43.265] I0712 08:48:43.168916   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"frontend", UID:"0a9ebd64-908f-4592-a79b-ebe439e1489e", APIVersion:"apps/v1", ResourceVersion:"2482", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-zcvjt
W0712 08:48:43.266] I0712 08:48:43.174360   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"frontend", UID:"0a9ebd64-908f-4592-a79b-ebe439e1489e", APIVersion:"apps/v1", ResourceVersion:"2482", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-57rvs
W0712 08:48:43.266] I0712 08:48:43.175183   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"frontend", UID:"0a9ebd64-908f-4592-a79b-ebe439e1489e", APIVersion:"apps/v1", ResourceVersion:"2482", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-kph8c
W0712 08:48:43.319] I0712 08:48:43.318653   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"redis-slave", UID:"f9a91d77-940c-429a-8a38-d94b343f3069", APIVersion:"apps/v1", ResourceVersion:"2491", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-cxlmc
W0712 08:48:43.323] I0712 08:48:43.322828   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"redis-slave", UID:"f9a91d77-940c-429a-8a38-d94b343f3069", APIVersion:"apps/v1", ResourceVersion:"2491", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-wzfxt
I0712 08:48:43.424] replicaset.apps/redis-slave created
... skipping 8 lines ...
I0712 08:48:44.055] horizontalpodautoscaler.autoscaling/frontend autoscaled
I0712 08:48:44.141] apps.sh:652: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
I0712 08:48:44.221] horizontalpodautoscaler.autoscaling "frontend" deleted
I0712 08:48:44.303] horizontalpodautoscaler.autoscaling/frontend autoscaled
I0712 08:48:44.387] apps.sh:656: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
I0712 08:48:44.457] horizontalpodautoscaler.autoscaling "frontend" deleted
W0712 08:48:44.558] E0712 08:48:43.836083   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:44.558] I0712 08:48:43.892965   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"frontend", UID:"26768ce1-2c90-40d3-96bc-90399cafd95d", APIVersion:"apps/v1", ResourceVersion:"2510", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-fp579
W0712 08:48:44.558] I0712 08:48:43.897589   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"frontend", UID:"26768ce1-2c90-40d3-96bc-90399cafd95d", APIVersion:"apps/v1", ResourceVersion:"2510", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-gf8vt
W0712 08:48:44.559] I0712 08:48:43.897916   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921316-349", Name:"frontend", UID:"26768ce1-2c90-40d3-96bc-90399cafd95d", APIVersion:"apps/v1", ResourceVersion:"2510", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-z87ck
W0712 08:48:44.559] E0712 08:48:43.910136   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:44.559] E0712 08:48:43.997654   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:44.559] E0712 08:48:44.092518   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:44.559] Error: required flag(s) "max" not set
W0712 08:48:44.559] 
W0712 08:48:44.560] 
W0712 08:48:44.560] Examples:
W0712 08:48:44.560]   # Auto scale a deployment "foo", with the number of pods between 2 and 10, no target CPU utilization specified so a default autoscaling policy will be used:
W0712 08:48:44.560]   kubectl autoscale deployment foo --min=2 --max=10
W0712 08:48:44.560]   
... skipping 87 lines ...
I0712 08:48:47.478] apps.sh:436: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I0712 08:48:47.564] apps.sh:437: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
I0712 08:48:47.660] statefulset.apps/nginx rolled back
I0712 08:48:47.756] apps.sh:440: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
I0712 08:48:47.842] apps.sh:441: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0712 08:48:47.934] Successful
I0712 08:48:47.935] message:error: unable to find specified revision 1000000 in history
I0712 08:48:47.935] has:unable to find specified revision
I0712 08:48:48.024] apps.sh:445: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
I0712 08:48:48.107] apps.sh:446: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0712 08:48:48.196] statefulset.apps/nginx rolled back
I0712 08:48:48.282] apps.sh:449: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
I0712 08:48:48.369] apps.sh:450: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
... skipping 7 lines ...
I0712 08:48:48.684] +++ working dir: /go/src/k8s.io/kubernetes
I0712 08:48:48.686] +++ command: run_lists_tests
I0712 08:48:48.699] +++ [0712 08:48:48] Creating namespace namespace-1562921328-13607
I0712 08:48:48.763] namespace/namespace-1562921328-13607 created
I0712 08:48:48.829] Context "test" modified.
I0712 08:48:48.836] +++ [0712 08:48:48] Testing kubectl(v1:lists)
W0712 08:48:48.937] E0712 08:48:44.837393   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:48.937] E0712 08:48:44.911514   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:48.938] E0712 08:48:44.998988   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:48.938] I0712 08:48:45.056271   48560 controller.go:606] quota admission added evaluator for: statefulsets.apps
W0712 08:48:48.938] E0712 08:48:45.094403   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:48.939] I0712 08:48:45.340117   51895 event.go:255] Event(v1.ObjectReference{Kind:"StatefulSet", Namespace:"namespace-1562921324-22191", Name:"nginx", UID:"485b2da2-f348-4e3e-8928-db8f4fe4461c", APIVersion:"apps/v1", ResourceVersion:"2536", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' create Pod nginx-0 in StatefulSet nginx successful
W0712 08:48:48.939] E0712 08:48:45.838674   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:48.939] I0712 08:48:45.844176   51895 stateful_set.go:420] StatefulSet has been deleted namespace-1562921324-22191/nginx
W0712 08:48:48.939] E0712 08:48:45.912807   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:48.940] E0712 08:48:46.000368   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:48.940] E0712 08:48:46.095857   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:48.940] E0712 08:48:46.840381   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:48.941] E0712 08:48:46.914857   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:48.941] E0712 08:48:47.001928   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:48.941] E0712 08:48:47.097336   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:48.942] E0712 08:48:47.841568   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:48.942] E0712 08:48:47.916587   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:48.942] E0712 08:48:48.003230   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:48.943] E0712 08:48:48.098555   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:48.943] I0712 08:48:48.525704   51895 stateful_set.go:420] StatefulSet has been deleted namespace-1562921326-18619/nginx
W0712 08:48:48.943] E0712 08:48:48.843004   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:48.943] E0712 08:48:48.918107   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:48.989] I0712 08:48:48.989135   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921328-13607", Name:"list-deployment-test", UID:"7d50d8ad-54b0-49a2-83c3-b2d40e2a2457", APIVersion:"apps/v1", ResourceVersion:"2571", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set list-deployment-test-5ffdc4bc8b to 1
W0712 08:48:48.995] I0712 08:48:48.994508   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921328-13607", Name:"list-deployment-test-5ffdc4bc8b", UID:"e599248c-3936-4498-ab69-2ba2e154a771", APIVersion:"apps/v1", ResourceVersion:"2572", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: list-deployment-test-5ffdc4bc8b-bkrnx
W0712 08:48:49.004] E0712 08:48:49.004246   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:49.100] E0712 08:48:49.099686   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:48:49.201] service/list-service-test created
I0712 08:48:49.201] deployment.apps/list-deployment-test created
I0712 08:48:49.201] service "list-service-test" deleted
I0712 08:48:49.201] deployment.apps "list-deployment-test" deleted
I0712 08:48:49.202] +++ exit code: 0
I0712 08:48:49.202] Recording: run_multi_resources_tests
... skipping 16 lines ...
I0712 08:48:49.843] NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
I0712 08:48:49.844] service/mock   ClusterIP   10.0.0.241   <none>        99/TCP    0s
I0712 08:48:49.844] NAME                         DESIRED   CURRENT   READY   AGE
I0712 08:48:49.845] replicationcontroller/mock   1         1         0       0s
W0712 08:48:49.945] I0712 08:48:49.611474   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921329-3061", Name:"mock", UID:"1636c1b0-f7e8-4fa0-bcb7-586b315037f6", APIVersion:"v1", ResourceVersion:"2594", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-8p6c7
W0712 08:48:49.946] 
W0712 08:48:49.946] E0712 08:48:49.843783   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:49.947] E0712 08:48:49.919031   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:50.005] E0712 08:48:50.005118   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:50.101] E0712 08:48:50.101131   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:50.171] I0712 08:48:50.170391   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921329-3061", Name:"mock", UID:"0291a19e-4acb-491b-8cf8-cfc2874a7078", APIVersion:"v1", ResourceVersion:"2608", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-xwgb6
I0712 08:48:50.271] Name:              mock
I0712 08:48:50.272] Namespace:         namespace-1562921329-3061
I0712 08:48:50.272] Labels:            app=mock
I0712 08:48:50.273] Annotations:       <none>
I0712 08:48:50.273] Selector:          app=mock
... skipping 9 lines ...
I0712 08:48:50.277] Name:         mock
I0712 08:48:50.277] Namespace:    namespace-1562921329-3061
I0712 08:48:50.278] Selector:     app=mock
I0712 08:48:50.278] Labels:       app=mock
I0712 08:48:50.278] Annotations:  <none>
I0712 08:48:50.278] Replicas:     1 current / 1 desired
I0712 08:48:50.279] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0712 08:48:50.279] Pod Template:
I0712 08:48:50.280]   Labels:  app=mock
I0712 08:48:50.280]   Containers:
I0712 08:48:50.280]    mock-container:
I0712 08:48:50.281]     Image:        k8s.gcr.io/pause:2.0
I0712 08:48:50.281]     Port:         9949/TCP
... skipping 33 lines ...
I0712 08:48:51.681] generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
I0712 08:48:51.758] generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
I0712 08:48:51.826] NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
I0712 08:48:51.827] service/mock   ClusterIP   10.0.0.93    <none>        99/TCP    0s
I0712 08:48:51.827] NAME                         DESIRED   CURRENT   READY   AGE
I0712 08:48:51.827] replicationcontroller/mock   1         1         0       0s
W0712 08:48:51.928] E0712 08:48:50.844864   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:51.928] E0712 08:48:50.920641   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:51.929] E0712 08:48:51.006251   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:51.929] E0712 08:48:51.102872   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:51.929] I0712 08:48:51.597093   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921329-3061", Name:"mock", UID:"2e38a1bc-26fd-4a08-b4f0-4175ec6dcf23", APIVersion:"v1", ResourceVersion:"2632", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-7p8vf
W0712 08:48:51.930] 
W0712 08:48:51.930] E0712 08:48:51.846115   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:51.930] E0712 08:48:51.921971   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:52.008] E0712 08:48:52.007583   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:52.105] E0712 08:48:52.104317   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:52.174] I0712 08:48:52.173508   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921329-3061", Name:"mock", UID:"72580a30-183a-4390-8009-758ae5fcbe68", APIVersion:"v1", ResourceVersion:"2646", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-kzhq8
I0712 08:48:52.275] Name:              mock
I0712 08:48:52.275] Namespace:         namespace-1562921329-3061
I0712 08:48:52.275] Labels:            app=mock
I0712 08:48:52.275] Annotations:       <none>
I0712 08:48:52.275] Selector:          app=mock
... skipping 9 lines ...
I0712 08:48:52.276] Name:         mock
I0712 08:48:52.276] Namespace:    namespace-1562921329-3061
I0712 08:48:52.277] Selector:     app=mock
I0712 08:48:52.277] Labels:       app=mock
I0712 08:48:52.277] Annotations:  <none>
I0712 08:48:52.277] Replicas:     1 current / 1 desired
I0712 08:48:52.277] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0712 08:48:52.277] Pod Template:
I0712 08:48:52.277]   Labels:  app=mock
I0712 08:48:52.277]   Containers:
I0712 08:48:52.277]    mock-container:
I0712 08:48:52.277]     Image:        k8s.gcr.io/pause:2.0
I0712 08:48:52.277]     Port:         9949/TCP
... skipping 33 lines ...
I0712 08:48:53.738] generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
I0712 08:48:53.816] generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
I0712 08:48:53.892] NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
I0712 08:48:53.892] service/mock   ClusterIP   10.0.0.67    <none>        99/TCP    0s
I0712 08:48:53.893] NAME                         DESIRED   CURRENT   READY   AGE
I0712 08:48:53.893] replicationcontroller/mock   1         1         0       0s
W0712 08:48:53.994] E0712 08:48:52.847303   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:53.994] E0712 08:48:52.923083   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:53.994] E0712 08:48:53.008979   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:53.994] E0712 08:48:53.105950   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:53.995] I0712 08:48:53.646852   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921329-3061", Name:"mock", UID:"e9a17997-6fd0-4fae-a46b-6fb7207251d7", APIVersion:"v1", ResourceVersion:"2670", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-6j4dt
W0712 08:48:53.995] E0712 08:48:53.848580   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:53.995] 
W0712 08:48:53.995] E0712 08:48:53.924712   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:54.010] E0712 08:48:54.010207   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:54.108] E0712 08:48:54.108152   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:48:54.209] Name:              mock
I0712 08:48:54.209] Namespace:         namespace-1562921329-3061
I0712 08:48:54.209] Labels:            app=mock
I0712 08:48:54.209] Annotations:       <none>
I0712 08:48:54.209] Selector:          app=mock
I0712 08:48:54.210] Type:              ClusterIP
... skipping 8 lines ...
I0712 08:48:54.210] Name:         mock
I0712 08:48:54.210] Namespace:    namespace-1562921329-3061
I0712 08:48:54.210] Selector:     app=mock
I0712 08:48:54.210] Labels:       app=mock
I0712 08:48:54.211] Annotations:  <none>
I0712 08:48:54.211] Replicas:     1 current / 1 desired
I0712 08:48:54.211] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0712 08:48:54.211] Pod Template:
I0712 08:48:54.211]   Labels:  app=mock
I0712 08:48:54.211]   Containers:
I0712 08:48:54.211]    mock-container:
I0712 08:48:54.211]     Image:        k8s.gcr.io/pause:2.0
I0712 08:48:54.211]     Port:         9949/TCP
... skipping 32 lines ...
I0712 08:48:55.696] replicationcontroller/mock2 created
I0712 08:48:55.800] generic-resources.sh:78: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:mock2:
I0712 08:48:55.869] NAME    DESIRED   CURRENT   READY   AGE
I0712 08:48:55.870] mock    1         1         0       0s
I0712 08:48:55.870] mock2   1         1         0       0s
W0712 08:48:55.971] I0712 08:48:54.238681   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921329-3061", Name:"mock", UID:"462a0109-e580-4006-b76f-ae185e1a10cc", APIVersion:"v1", ResourceVersion:"2684", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-zzkjc
W0712 08:48:55.971] E0712 08:48:54.849990   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:55.971] E0712 08:48:54.926148   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:55.971] E0712 08:48:55.011503   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:55.972] E0712 08:48:55.109851   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:55.972] I0712 08:48:55.698783   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921329-3061", Name:"mock", UID:"d17d0b71-f817-4fc2-ab02-a164b6bea7e8", APIVersion:"v1", ResourceVersion:"2704", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-rv6qc
W0712 08:48:55.972] I0712 08:48:55.702071   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921329-3061", Name:"mock2", UID:"e5bfe399-040e-4563-84ba-2766ea54a248", APIVersion:"v1", ResourceVersion:"2705", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock2-p7dm7
W0712 08:48:55.972] E0712 08:48:55.851615   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:55.973] E0712 08:48:55.927614   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:56.013] E0712 08:48:56.012850   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:56.112] E0712 08:48:56.111583   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:48:56.213] Name:         mock
I0712 08:48:56.213] Namespace:    namespace-1562921329-3061
I0712 08:48:56.214] Selector:     app=mock
I0712 08:48:56.214] Labels:       app=mock
I0712 08:48:56.214]               status=replaced
I0712 08:48:56.215] Annotations:  <none>
I0712 08:48:56.215] Replicas:     1 current / 1 desired
I0712 08:48:56.215] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0712 08:48:56.215] Pod Template:
I0712 08:48:56.216]   Labels:  app=mock
I0712 08:48:56.216]   Containers:
I0712 08:48:56.216]    mock-container:
I0712 08:48:56.216]     Image:        k8s.gcr.io/pause:2.0
I0712 08:48:56.216]     Port:         9949/TCP
... skipping 11 lines ...
I0712 08:48:56.219] Namespace:    namespace-1562921329-3061
I0712 08:48:56.220] Selector:     app=mock2
I0712 08:48:56.220] Labels:       app=mock2
I0712 08:48:56.220]               status=replaced
I0712 08:48:56.220] Annotations:  <none>
I0712 08:48:56.221] Replicas:     1 current / 1 desired
I0712 08:48:56.221] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0712 08:48:56.221] Pod Template:
I0712 08:48:56.221]   Labels:  app=mock2
I0712 08:48:56.222]   Containers:
I0712 08:48:56.222]    mock-container:
I0712 08:48:56.222]     Image:        k8s.gcr.io/pause:2.0
I0712 08:48:56.222]     Port:         9949/TCP
... skipping 33 lines ...
I0712 08:48:57.818] generic-resources.sh:70: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:mock2:
I0712 08:48:57.896] NAME    TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
I0712 08:48:57.897] mock    ClusterIP   10.0.0.64    <none>        99/TCP    0s
I0712 08:48:57.897] mock2   ClusterIP   10.0.0.129   <none>        99/TCP    0s
W0712 08:48:57.998] I0712 08:48:56.221623   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921329-3061", Name:"mock", UID:"9a80e5a2-e232-4f45-8b8b-4586851752d9", APIVersion:"v1", ResourceVersion:"2720", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-nm47q
W0712 08:48:57.998] I0712 08:48:56.226975   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921329-3061", Name:"mock2", UID:"a0b3405b-84eb-4e58-8f75-c2b8c6ca6b0c", APIVersion:"v1", ResourceVersion:"2722", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock2-g5k5v
W0712 08:48:57.999] E0712 08:48:56.852642   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:57.999] E0712 08:48:56.928758   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:57.999] E0712 08:48:57.013904   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:58.000] E0712 08:48:57.112678   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:58.000] E0712 08:48:57.854154   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:58.000] E0712 08:48:57.930205   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:58.016] E0712 08:48:58.015272   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:48:58.114] E0712 08:48:58.114005   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:48:58.215] Name:              mock
I0712 08:48:58.215] Namespace:         namespace-1562921329-3061
I0712 08:48:58.216] Labels:            app=mock
I0712 08:48:58.216] Annotations:       <none>
I0712 08:48:58.216] Selector:          app=mock
I0712 08:48:58.216] Type:              ClusterIP
... skipping 59 lines ...
I0712 08:49:00.905] Context "test" modified.
I0712 08:49:00.913] +++ [0712 08:49:00] Testing persistent volumes
I0712 08:49:01.004] storage.sh:30: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:49:01.161] persistentvolume/pv0001 created
I0712 08:49:01.258] storage.sh:33: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
I0712 08:49:01.336] persistentvolume "pv0001" deleted
W0712 08:49:01.436] E0712 08:48:58.855593   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:01.437] E0712 08:48:58.931194   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:01.437] E0712 08:48:59.016327   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:01.437] I0712 08:48:59.054119   51895 horizontal.go:341] Horizontal Pod Autoscaler frontend has been deleted in namespace-1562921316-349
W0712 08:49:01.438] E0712 08:48:59.115495   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:01.438] E0712 08:48:59.857003   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:01.438] E0712 08:48:59.932652   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:01.438] E0712 08:49:00.018019   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:01.438] I0712 08:49:00.065561   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921329-3061", Name:"mock", UID:"b345233c-42fd-4c02-bc05-f073012440cc", APIVersion:"v1", ResourceVersion:"2781", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-f2tc9
W0712 08:49:01.439] E0712 08:49:00.116879   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:01.439] E0712 08:49:00.858608   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:01.439] E0712 08:49:00.933812   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:01.439] E0712 08:49:01.019490   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:01.439] E0712 08:49:01.118291   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:01.440] E0712 08:49:01.169779   51895 pv_protection_controller.go:117] PV pv0001 failed with : Operation cannot be fulfilled on persistentvolumes "pv0001": the object has been modified; please apply your changes to the latest version and try again
I0712 08:49:01.540] persistentvolume/pv0002 created
I0712 08:49:01.617] storage.sh:36: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0002:
I0712 08:49:01.696] persistentvolume "pv0002" deleted
W0712 08:49:01.861] E0712 08:49:01.860348   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:01.936] E0712 08:49:01.935441   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:02.021] E0712 08:49:02.021074   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:02.120] E0712 08:49:02.119747   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:49:02.221] persistentvolume/pv0003 created
I0712 08:49:02.221] storage.sh:39: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0003:
I0712 08:49:02.221] persistentvolume "pv0003" deleted
I0712 08:49:02.221] storage.sh:42: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:49:02.301] persistentvolume/pv0001 created
I0712 08:49:02.402] storage.sh:45: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
... skipping 18 lines ...
I0712 08:49:02.832] Context "test" modified.
I0712 08:49:02.839] +++ [0712 08:49:02] Testing persistent volumes claims
I0712 08:49:02.930] storage.sh:64: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:49:03.088] persistentvolumeclaim/myclaim-1 created
I0712 08:49:03.188] storage.sh:67: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-1:
I0712 08:49:03.269] persistentvolumeclaim "myclaim-1" deleted
W0712 08:49:03.370] E0712 08:49:02.305484   51895 pv_protection_controller.go:117] PV pv0001 failed with : Operation cannot be fulfilled on persistentvolumes "pv0001": the object has been modified; please apply your changes to the latest version and try again
W0712 08:49:03.370] E0712 08:49:02.861908   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:03.371] E0712 08:49:02.937216   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:03.371] E0712 08:49:03.022554   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:03.372] I0712 08:49:03.090035   51895 event.go:255] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1562921342-1076", Name:"myclaim-1", UID:"0b927c01-f6ac-41e6-9808-4f910f79c84f", APIVersion:"v1", ResourceVersion:"2818", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
W0712 08:49:03.372] I0712 08:49:03.094284   51895 event.go:255] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1562921342-1076", Name:"myclaim-1", UID:"0b927c01-f6ac-41e6-9808-4f910f79c84f", APIVersion:"v1", ResourceVersion:"2820", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
W0712 08:49:03.372] E0712 08:49:03.121471   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:03.373] I0712 08:49:03.269845   51895 event.go:255] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1562921342-1076", Name:"myclaim-1", UID:"0b927c01-f6ac-41e6-9808-4f910f79c84f", APIVersion:"v1", ResourceVersion:"2823", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
W0712 08:49:03.427] I0712 08:49:03.426475   51895 event.go:255] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1562921342-1076", Name:"myclaim-2", UID:"267b6c97-0514-428d-b3c3-41b0abf95841", APIVersion:"v1", ResourceVersion:"2826", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
W0712 08:49:03.432] I0712 08:49:03.431695   51895 event.go:255] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1562921342-1076", Name:"myclaim-2", UID:"267b6c97-0514-428d-b3c3-41b0abf95841", APIVersion:"v1", ResourceVersion:"2828", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I0712 08:49:03.533] persistentvolumeclaim/myclaim-2 created
I0712 08:49:03.533] storage.sh:71: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-2:
I0712 08:49:03.598] persistentvolumeclaim "myclaim-2" deleted
... skipping 151 lines ...
I0712 08:49:05.295]   Resource           Requests  Limits
I0712 08:49:05.295]   --------           --------  ------
I0712 08:49:05.295]   cpu                0 (0%)    0 (0%)
I0712 08:49:05.295]   memory             0 (0%)    0 (0%)
I0712 08:49:05.295]   ephemeral-storage  0 (0%)    0 (0%)
I0712 08:49:05.295] 
W0712 08:49:05.396] E0712 08:49:03.865212   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:05.396] E0712 08:49:03.938747   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:05.397] I0712 08:49:03.945006   51895 event.go:255] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1562921342-1076", Name:"myclaim-3", UID:"ed764722-2b30-4a18-92a3-31433c237aa9", APIVersion:"v1", ResourceVersion:"2837", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
W0712 08:49:05.397] E0712 08:49:04.023967   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:05.397] E0712 08:49:04.123165   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:05.397] E0712 08:49:04.866417   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:05.398] E0712 08:49:04.940294   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:05.398] E0712 08:49:05.025649   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:05.398] E0712 08:49:05.124505   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:49:05.498] core.sh:1377: Successful describe
I0712 08:49:05.499] Name:               127.0.0.1
I0712 08:49:05.499] Roles:              <none>
I0712 08:49:05.499] Labels:             <none>
I0712 08:49:05.500] Annotations:        node.alpha.kubernetes.io/ttl: 0
I0712 08:49:05.500] CreationTimestamp:  Fri, 12 Jul 2019 08:44:52 +0000
... skipping 280 lines ...
I0712 08:49:06.830]   "status": {
I0712 08:49:06.830]     "allowed": true,
I0712 08:49:06.830]     "reason": "RBAC: allowed by ClusterRoleBinding \"super-group\" of ClusterRole \"admin\" to Group \"the-group\""
I0712 08:49:06.830]   }
I0712 08:49:06.830] }
I0712 08:49:06.842] +++ exit code: 0
W0712 08:49:06.942] E0712 08:49:05.867736   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:06.943] E0712 08:49:05.941772   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:06.943] E0712 08:49:06.027558   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:06.943] E0712 08:49:06.126013   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:06.943]   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
W0712 08:49:06.943]                                  Dload  Upload   Total   Spent    Left  Speed
W0712 08:49:06.943] 
100  1170  100   868  100   302   169k  60315 --:--:-- --:--:-- --:--:--  211k
W0712 08:49:06.944]   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
W0712 08:49:06.944]                                  Dload  Upload   Total   Spent    Left  Speed
W0712 08:49:06.944] 
100  1158  100   860  100   298   156k  55690 --:--:-- --:--:-- --:--:--  167k
W0712 08:49:06.944] E0712 08:49:06.869491   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:06.944] E0712 08:49:06.943986   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:07.029] E0712 08:49:07.028798   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:07.128] E0712 08:49:07.127265   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:49:07.228] Successful
I0712 08:49:07.229] message:yes
I0712 08:49:07.229] has:yes
I0712 08:49:07.229] Successful
I0712 08:49:07.229] message:yes
I0712 08:49:07.229] has:yes
... skipping 2 lines ...
I0712 08:49:07.229] yes
I0712 08:49:07.229] has:the server doesn't have a resource type
I0712 08:49:07.229] Successful
I0712 08:49:07.229] message:yes
I0712 08:49:07.229] has:yes
I0712 08:49:07.283] Successful
I0712 08:49:07.283] message:error: --subresource can not be used with NonResourceURL
I0712 08:49:07.283] has:subresource can not be used with NonResourceURL
I0712 08:49:07.361] Successful
I0712 08:49:07.442] Successful
I0712 08:49:07.442] message:yes
I0712 08:49:07.442] 0
I0712 08:49:07.442] has:0
... skipping 27 lines ...
I0712 08:49:08.058] role.rbac.authorization.k8s.io/testing-R reconciled
I0712 08:49:08.149] legacy-script.sh:798: Successful get rolebindings -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-RB:
I0712 08:49:08.234] legacy-script.sh:799: Successful get roles -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-R:
I0712 08:49:08.320] legacy-script.sh:800: Successful get clusterrolebindings -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CRB:
I0712 08:49:08.410] legacy-script.sh:801: Successful get clusterroles -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CR:
I0712 08:49:08.484] Successful
I0712 08:49:08.485] message:error: only rbac.authorization.k8s.io/v1 is supported: not *v1beta1.ClusterRole
I0712 08:49:08.486] has:only rbac.authorization.k8s.io/v1 is supported
I0712 08:49:08.573] rolebinding.rbac.authorization.k8s.io "testing-RB" deleted
I0712 08:49:08.578] role.rbac.authorization.k8s.io "testing-R" deleted
I0712 08:49:08.590] clusterrole.rbac.authorization.k8s.io "testing-CR" deleted
I0712 08:49:08.600] clusterrolebinding.rbac.authorization.k8s.io "testing-CRB" deleted
I0712 08:49:08.610] Recording: run_retrieve_multiple_tests
... skipping 13 lines ...
I0712 08:49:08.871] +++ working dir: /go/src/k8s.io/kubernetes
I0712 08:49:08.873] +++ command: run_resource_aliasing_tests
I0712 08:49:08.883] +++ [0712 08:49:08] Creating namespace namespace-1562921348-19185
I0712 08:49:08.950] namespace/namespace-1562921348-19185 created
I0712 08:49:09.015] Context "test" modified.
I0712 08:49:09.021] +++ [0712 08:49:09] Testing resource aliasing
W0712 08:49:09.122] E0712 08:49:07.870816   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:09.122] E0712 08:49:07.945311   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:09.122] E0712 08:49:08.030057   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:09.123] 	reconciliation required create
W0712 08:49:09.123] 	missing rules added:
W0712 08:49:09.123] 		{Verbs:[create delete deletecollection get list patch update watch] APIGroups:[] Resources:[pods] ResourceNames:[] NonResourceURLs:[]}
W0712 08:49:09.123] 	reconciliation required create
W0712 08:49:09.123] 	missing subjects added:
W0712 08:49:09.123] 		{Kind:Group APIGroup:rbac.authorization.k8s.io Name:system:masters Namespace:}
W0712 08:49:09.123] 	reconciliation required create
W0712 08:49:09.124] 	missing subjects added:
W0712 08:49:09.124] 		{Kind:Group APIGroup:rbac.authorization.k8s.io Name:system:masters Namespace:}
W0712 08:49:09.124] 	reconciliation required create
W0712 08:49:09.124] 	missing rules added:
W0712 08:49:09.124] 		{Verbs:[get list watch] APIGroups:[] Resources:[configmaps] ResourceNames:[] NonResourceURLs:[]}
W0712 08:49:09.124] E0712 08:49:08.128670   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:09.125] warning: deleting cluster-scoped resources, not scoped to the provided namespace
W0712 08:49:09.125] E0712 08:49:08.872151   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:09.125] E0712 08:49:08.946505   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:09.125] E0712 08:49:09.031284   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:09.130] E0712 08:49:09.130021   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:09.182] I0712 08:49:09.181398   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921348-19185", Name:"cassandra", UID:"4b858786-53c6-44c5-ba00-2ee59b862c10", APIVersion:"v1", ResourceVersion:"2860", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-nffnb
W0712 08:49:09.189] I0712 08:49:09.188437   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921348-19185", Name:"cassandra", UID:"4b858786-53c6-44c5-ba00-2ee59b862c10", APIVersion:"v1", ResourceVersion:"2860", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-sbhgl
I0712 08:49:09.289] replicationcontroller/cassandra created
I0712 08:49:09.336] service/cassandra created
I0712 08:49:09.463] discovery.sh:89: Successful get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}}: cassandra:cassandra:cassandra:cassandra:
I0712 08:49:09.557] pod "cassandra-nffnb" deleted
... skipping 7 lines ...
I0712 08:49:09.673] +++ Running case: test-cmd.run_kubectl_explain_tests 
I0712 08:49:09.675] +++ working dir: /go/src/k8s.io/kubernetes
I0712 08:49:09.677] +++ command: run_kubectl_explain_tests
I0712 08:49:09.686] +++ [0712 08:49:09] Testing kubectl(v1:explain)
W0712 08:49:09.787] I0712 08:49:09.562223   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921348-19185", Name:"cassandra", UID:"4b858786-53c6-44c5-ba00-2ee59b862c10", APIVersion:"v1", ResourceVersion:"2866", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-9vppb
W0712 08:49:09.787] I0712 08:49:09.578424   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921348-19185", Name:"cassandra", UID:"4b858786-53c6-44c5-ba00-2ee59b862c10", APIVersion:"v1", ResourceVersion:"2866", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-lcrrk
W0712 08:49:09.788] E0712 08:49:09.582327   51895 replica_set.go:450] Sync "namespace-1562921348-19185/cassandra" failed with replicationcontrollers "cassandra" not found
W0712 08:49:09.874] E0712 08:49:09.873784   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:09.948] E0712 08:49:09.947963   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:10.033] E0712 08:49:10.032929   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:10.132] E0712 08:49:10.131334   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:49:10.232] KIND:     Pod
I0712 08:49:10.233] VERSION:  v1
I0712 08:49:10.233] 
I0712 08:49:10.233] DESCRIPTION:
I0712 08:49:10.233]      Pod is a collection of containers that can run on a host. This resource is
I0712 08:49:10.234]      created by clients and scheduled onto hosts.
... skipping 187 lines ...
I0712 08:49:12.979] I0712 08:49:12.964011   84205 request.go:968] Response Body: {"kind":"PodList","apiVersion":"v1","metadata":{"selfLink":"/api/v1/namespaces/namespace-1562921348-19185/pods","resourceVersion":"2893"},"items":[{"metadata":{"name":"sorted-pod1","namespace":"namespace-1562921348-19185","selfLink":"/api/v1/namespaces/namespace-1562921348-19185/pods/sorted-pod1","uid":"9dc9742a-0498-4773-8e22-807c7060d3cc","resourceVersion":"2890","creationTimestamp":"2019-07-12T08:49:11Z","labels":{"name":"sorted-pod3-label"},"managedFields":[{"manager":"kubectl","operation":"Update","apiVersion":"v1","time":"2019-07-12T08:49:11Z","fields":{"f:metadata":{"f:labels":{".":{},"f:name":{}}},"f:spec":{"f:containers":{"k:{\"name\":\"kubernetes-pause2\"}":{".":{},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:resources":{},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:enableServiceLinks":{},"f:priority":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}}}]},"spec":{"containers":[{"name":"kubernetes-pau [truncated 2974 chars]
I0712 08:49:12.979] NAME          AGE
I0712 08:49:12.979] sorted-pod2   0s
I0712 08:49:12.979] sorted-pod1   1s
I0712 08:49:12.979] sorted-pod3   0s
I0712 08:49:12.979] has not:Table
W0712 08:49:13.080] E0712 08:49:10.875247   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:13.081] E0712 08:49:10.949170   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:13.081] No resources found in namespace-1562921348-19185 namespace.
W0712 08:49:13.081] E0712 08:49:11.034311   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:13.081] No resources found in namespace-1562921348-19185 namespace.
W0712 08:49:13.082] E0712 08:49:11.132753   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:13.082] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0712 08:49:13.082] E0712 08:49:11.876455   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:13.082] E0712 08:49:11.950527   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:13.083] E0712 08:49:12.035473   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:13.083] E0712 08:49:12.134111   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:13.083] E0712 08:49:12.877673   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:13.084] E0712 08:49:12.951915   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:13.084] E0712 08:49:13.036821   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:13.137] E0712 08:49:13.136721   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:13.162] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
I0712 08:49:13.263] get.sh:329: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:sorted-pod2:sorted-pod3:
I0712 08:49:13.263] pod "sorted-pod1" force deleted
I0712 08:49:13.263] pod "sorted-pod2" force deleted
I0712 08:49:13.263] pod "sorted-pod3" force deleted
I0712 08:49:13.274] get.sh:333: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
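get.sh renders the pod names through a Go template and then force-deletes all three pods, which is what produces the "Immediate deletion does not wait for confirmation" warnings interleaved above. A rough equivalent of those two steps, with the pod names taken from the log and the flags inferred from the warning text:

    # join pod names with ':' exactly as the get.sh assertion expects
    kubectl get pods -o go-template='{{range .items}}{{.metadata.name}}:{{end}}'
    # immediate deletion; prints the warning quoted above
    kubectl delete pod sorted-pod1 sorted-pod2 sorted-pod3 --force --grace-period=0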
... skipping 137 lines ...
I0712 08:49:14.343] namespace-1562921340-17505   default   0         14s
I0712 08:49:14.343] namespace-1562921342-1076    default   0         12s
I0712 08:49:14.343] namespace-1562921348-19185   default   0         6s
I0712 08:49:14.343] some-other-random            default   0         6s
I0712 08:49:14.343] has:all-ns-test-2
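The four-column table above (namespace, a resource named "default", a count, an age) with the has:all-ns-test-2 assertion looks like a listing across all namespaces, most likely of service accounts since every namespace carries a "default" one; the resource type is inferred from the column shape, not stated in the log:

    # every namespace, including the freshly created all-ns-test-1/2, shows its default service account
    kubectl get serviceaccounts --all-namespaces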
I0712 08:49:14.415] namespace "all-ns-test-1" deleted
W0712 08:49:14.516] E0712 08:49:13.879188   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:14.516] E0712 08:49:13.953051   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:14.516] E0712 08:49:14.038152   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:14.516] E0712 08:49:14.138087   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:14.882] E0712 08:49:14.881103   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:14.955] E0712 08:49:14.954730   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:15.040] E0712 08:49:15.039572   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:15.140] E0712 08:49:15.139710   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:15.883] E0712 08:49:15.882685   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:15.956] E0712 08:49:15.956170   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:16.041] E0712 08:49:16.041120   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:16.142] E0712 08:49:16.141840   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:16.884] E0712 08:49:16.884201   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:16.958] E0712 08:49:16.957909   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:17.043] E0712 08:49:17.043103   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:17.144] E0712 08:49:17.143364   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:17.886] E0712 08:49:17.885615   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:17.960] E0712 08:49:17.959456   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:18.045] E0712 08:49:18.044564   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:18.145] E0712 08:49:18.145180   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:18.887] E0712 08:49:18.887201   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:18.961] E0712 08:49:18.961071   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:19.046] E0712 08:49:19.046011   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:19.147] E0712 08:49:19.146877   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:49:19.587] namespace "all-ns-test-2" deleted
W0712 08:49:19.889] E0712 08:49:19.888935   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:19.963] E0712 08:49:19.962446   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:20.048] E0712 08:49:20.047696   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:20.149] E0712 08:49:20.148425   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:20.891] E0712 08:49:20.890743   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:20.965] E0712 08:49:20.964251   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:21.049] E0712 08:49:21.049234   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:21.150] E0712 08:49:21.149950   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:21.893] E0712 08:49:21.892237   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:21.966] E0712 08:49:21.965662   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:22.051] E0712 08:49:22.050468   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:22.152] E0712 08:49:22.151510   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:22.894] E0712 08:49:22.893650   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:22.968] E0712 08:49:22.967561   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:23.052] E0712 08:49:23.052233   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:23.154] E0712 08:49:23.153429   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:23.896] E0712 08:49:23.895317   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:23.969] E0712 08:49:23.969194   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:24.054] E0712 08:49:24.053953   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:24.155] E0712 08:49:24.154825   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:49:24.774] get.sh:380: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0712 08:49:24.856] pod "valid-pod" force deleted
I0712 08:49:24.949] get.sh:384: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0712 08:49:25.044] get.sh:388: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
I0712 08:49:25.131] Successful
I0712 08:49:25.131] message:NAME        STATUS     ROLES    AGE     VERSION
... skipping 125 lines ...
I0712 08:49:25.975] message:valid-pod:
I0712 08:49:25.976] has:valid-pod:
I0712 08:49:26.056] Successful
I0712 08:49:26.056] message:valid-pod:
I0712 08:49:26.056] has:valid-pod:
W0712 08:49:26.157] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0712 08:49:26.157] E0712 08:49:24.896777   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:26.157] E0712 08:49:24.970426   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:26.158] E0712 08:49:25.055573   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:26.158] E0712 08:49:25.156618   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:26.158] E0712 08:49:25.898101   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:26.159] E0712 08:49:25.972123   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:26.159] E0712 08:49:26.057314   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:26.160] E0712 08:49:26.158624   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:49:26.260] Successful
I0712 08:49:26.261] message:valid-pod:
I0712 08:49:26.261] has:valid-pod:
I0712 08:49:26.381] Successful
I0712 08:49:26.382] message:valid-pod:
I0712 08:49:26.382] has:valid-pod:
... skipping 13 lines ...
I0712 08:49:26.792] message:127.0.0.1:
I0712 08:49:26.792] has:127.0.0.1:
I0712 08:49:26.865] node/127.0.0.1 untainted
W0712 08:49:26.966] kubectl convert is DEPRECATED and will be removed in a future version.
W0712 08:49:26.966] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
W0712 08:49:26.967] kubectl run --generator=job/v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
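Both deprecation warnings above point at replacement workflows. A short sketch of what the messages suggest instead; the manifest name, object names and image are illustrative:

    # instead of 'kubectl convert': apply the object, then read it back at the wanted version
    kubectl apply -f deployment.yaml
    kubectl get deployment my-deploy -o yaml
    # instead of 'kubectl run --generator=job/v1':
    kubectl create job my-job --image=busybox
    # or, for a plain pod:
    kubectl run my-pod --generator=run-pod/v1 --image=busybox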
W0712 08:49:26.968] E0712 08:49:26.899494   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:26.974] E0712 08:49:26.973852   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:27.019] I0712 08:49:27.018729   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921365-12037", Name:"cassandra", UID:"20e9822d-ae62-456b-a700-7dc9cb45a26f", APIVersion:"v1", ResourceVersion:"2931", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-lk7n5
W0712 08:49:27.027] I0712 08:49:27.026553   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921365-12037", Name:"cassandra", UID:"20e9822d-ae62-456b-a700-7dc9cb45a26f", APIVersion:"v1", ResourceVersion:"2931", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-ft8jt
W0712 08:49:27.058] E0712 08:49:27.058162   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:49:27.159] replicationcontroller/cassandra created
I0712 08:49:27.160] Successful
I0712 08:49:27.160] message:cassandra:
I0712 08:49:27.161] has:cassandra:
I0712 08:49:27.178] Successful
I0712 08:49:27.178] message:testing-CR:testing-CRB:testing-RB:testing-R:
... skipping 7 lines ...
I0712 08:49:27.408] Successful
I0712 08:49:27.408] message:cm:
I0712 08:49:27.409] has:cm:
I0712 08:49:27.487] Successful
I0712 08:49:27.487] message:deploy:
I0712 08:49:27.487] has:deploy:
W0712 08:49:27.588] E0712 08:49:27.160100   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:27.588] 	reconciliation required create
W0712 08:49:27.588] 	missing rules added:
W0712 08:49:27.589] 		{Verbs:[create delete deletecollection get list patch update watch] APIGroups:[] Resources:[pods] ResourceNames:[] NonResourceURLs:[]}
W0712 08:49:27.589] 	reconciliation required create
W0712 08:49:27.589] 	missing subjects added:
W0712 08:49:27.589] 		{Kind:Group APIGroup:rbac.authorization.k8s.io Name:system:masters Namespace:}
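The "reconciliation required create" / "missing rules added" / "missing subjects added" lines are the kind of diff reported when RBAC objects have to be created or extended, as kubectl auth reconcile does. A hedged guess at the command behind them; the file name is illustrative:

    # create or patch roles/bindings so they contain at least the rules and subjects in the file
    kubectl auth reconcile -f rbac-objects.yaml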
... skipping 84 lines ...
I0712 08:49:29.359]   name: test
I0712 08:49:29.359] current-context: test
I0712 08:49:29.359] kind: Config
I0712 08:49:29.359] preferences: {}
I0712 08:49:29.359] users: []
I0712 08:49:29.359] has:kind: Config
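The YAML ending in "kind: Config" is a kubeconfig dump, and the assertion only checks that the output really is a Config object. Something along these lines would produce it; only the context name "test" comes from the output, the rest is assumed:

    kubectl config set-context test
    kubectl config use-context test
    kubectl config view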
W0712 08:49:29.460] E0712 08:49:27.901249   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:29.460] E0712 08:49:27.975398   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:29.460] E0712 08:49:28.059644   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:29.460] E0712 08:49:28.161784   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:29.461] E0712 08:49:28.902793   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:29.461] E0712 08:49:28.977083   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:29.461] E0712 08:49:29.061533   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:29.461] E0712 08:49:29.163107   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:49:29.562] Successful
I0712 08:49:29.562] message:deploy:
I0712 08:49:29.562] has:deploy:
I0712 08:49:29.562] Successful
I0712 08:49:29.562] message:deploy:
I0712 08:49:29.562] has:deploy:
... skipping 27 lines ...
I0712 08:49:30.563] Running command: run_certificates_tests
I0712 08:49:30.583] 
I0712 08:49:30.585] +++ Running case: test-cmd.run_certificates_tests 
I0712 08:49:30.587] +++ working dir: /go/src/k8s.io/kubernetes
I0712 08:49:30.589] +++ command: run_certificates_tests
I0712 08:49:30.599] +++ [0712 08:49:30] Testing certificates
W0712 08:49:30.700] E0712 08:49:29.904310   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:30.701] E0712 08:49:29.978582   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:30.701] E0712 08:49:30.063533   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:30.701] I0712 08:49:30.103624   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921365-12037", Name:"cassandra", UID:"20e9822d-ae62-456b-a700-7dc9cb45a26f", APIVersion:"v1", ResourceVersion:"2937", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-fvlmj
W0712 08:49:30.701] I0712 08:49:30.117957   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921365-12037", Name:"deploy-854b66754", UID:"f59c58d8-7c3a-434c-9d8a-e8e7f998c664", APIVersion:"apps/v1", ResourceVersion:"2948", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deploy-854b66754-29bfb
W0712 08:49:30.702] I0712 08:49:30.123004   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1562921365-12037", Name:"cassandra", UID:"20e9822d-ae62-456b-a700-7dc9cb45a26f", APIVersion:"v1", ResourceVersion:"2967", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-kt7fq
W0712 08:49:30.702] E0712 08:49:30.164932   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:49:30.802] certificatesigningrequest.certificates.k8s.io/foo created
I0712 08:49:30.842] certificate.sh:29: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
I0712 08:49:30.912] certificatesigningrequest.certificates.k8s.io/foo approved
I0712 08:49:30.988] {
I0712 08:49:30.989]     "apiVersion": "v1",
I0712 08:49:30.989]     "items": [
... skipping 185 lines ...
I0712 08:49:32.376]         "resourceVersion": "",
I0712 08:49:32.376]         "selfLink": ""
I0712 08:49:32.376]     }
I0712 08:49:32.376] }
I0712 08:49:32.461] certificate.sh:49: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Denied
I0712 08:49:32.543] certificatesigningrequest.certificates.k8s.io "foo" deleted
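certificate.sh creates a CSR named foo, confirms it has no conditions, approves it, and (after the skipped lines) ends up checking a Denied condition before deleting the CSR. The approve/deny commands themselves, with only the CSR name taken from the log:

    kubectl certificate approve foo     # adds an Approved condition
    kubectl certificate deny foo        # or, on a fresh CSR: adds a Denied condition
    kubectl get csr foo -o jsonpath='{range .status.conditions[*]}{.type}{end}'
    kubectl delete csr foo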
W0712 08:49:32.644] E0712 08:49:30.905941   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:32.644] E0712 08:49:30.979878   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:32.644] E0712 08:49:31.065005   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:32.645] E0712 08:49:31.166231   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:32.645] E0712 08:49:31.907432   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:32.645] E0712 08:49:31.981584   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:32.645] E0712 08:49:32.066758   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:32.646] E0712 08:49:32.167796   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:49:32.746] certificate.sh:51: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
I0712 08:49:32.808] certificatesigningrequest.certificates.k8s.io/foo created
I0712 08:49:32.900] certificate.sh:54: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
I0712 08:49:32.982] certificatesigningrequest.certificates.k8s.io/foo approved
I0712 08:49:33.063] {
I0712 08:49:33.063]     "apiVersion": "v1",
... skipping 65 lines ...
I0712 08:49:33.402] +++ Running case: test-cmd.run_cluster_management_tests 
I0712 08:49:33.405] +++ working dir: /go/src/k8s.io/kubernetes
I0712 08:49:33.408] +++ command: run_cluster_management_tests
I0712 08:49:33.419] +++ [0712 08:49:33] Testing cluster-management commands
I0712 08:49:33.516] node-management.sh:27: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
I0712 08:49:33.664] pod/test-pod-1 created
W0712 08:49:33.766] E0712 08:49:32.909140   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:33.766] E0712 08:49:32.983116   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:33.767] E0712 08:49:33.069805   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:33.767] E0712 08:49:33.169452   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:49:33.868] pod/test-pod-2 created
I0712 08:49:33.895] node-management.sh:76: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: 
I0712 08:49:33.971] node/127.0.0.1 tainted
I0712 08:49:34.058] node-management.sh:79: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: dedicated=foo:PreferNoSchedule
I0712 08:49:34.136] node/127.0.0.1 untainted
I0712 08:49:34.222] node-management.sh:83: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: 
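node-management.sh adds a taint keyed "dedicated" to node 127.0.0.1 and removes it again, asserting the taint list before and after. The equivalent commands, with key, value and effect taken directly from the log:

    # add the taint checked at node-management.sh:79
    kubectl taint nodes 127.0.0.1 dedicated=foo:PreferNoSchedule
    # the trailing '-' removes it again
    kubectl taint nodes 127.0.0.1 dedicated:PreferNoSchedule-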
... skipping 18 lines ...
I0712 08:49:35.660] message:node/127.0.0.1 already uncordoned (dry run)
I0712 08:49:35.660] has:already uncordoned
I0712 08:49:35.747] node-management.sh:119: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
I0712 08:49:35.826] node/127.0.0.1 labeled
I0712 08:49:35.921] node-management.sh:124: Successful get nodes 127.0.0.1 {{.metadata.labels.test}}: label
I0712 08:49:35.992] Successful
I0712 08:49:35.993] message:error: cannot specify both a node name and a --selector option
I0712 08:49:35.993] See 'kubectl drain -h' for help and examples
I0712 08:49:35.994] has:cannot specify both a node name
I0712 08:49:36.066] Successful
I0712 08:49:36.067] message:error: USAGE: cordon NODE [flags]
I0712 08:49:36.068] See 'kubectl cordon -h' for help and examples
I0712 08:49:36.068] has:error\: USAGE\: cordon NODE
I0712 08:49:36.148] node/127.0.0.1 already uncordoned
I0712 08:49:36.230] Successful
I0712 08:49:36.230] message:error: You must provide one or more resources by argument or filename.
I0712 08:49:36.231] Example resource specifications include:
I0712 08:49:36.231]    '-f rsrc.yaml'
I0712 08:49:36.231]    '--filename=rsrc.json'
I0712 08:49:36.231]    '<resource> <name>'
I0712 08:49:36.232]    '<resource>'
I0712 08:49:36.232] has:must provide one or more resources
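The block above covers both the happy path (uncordon as a dry run, labeling the node) and the error paths: drain rejects a node name combined with --selector, cordon without an argument prints its USAGE line, and a command given no resource at all prints the resource-specification help quoted above. Roughly, with the node name and label from the log and the selector value invented:

    kubectl uncordon 127.0.0.1 --dry-run        # 'already uncordoned (dry run)'
    kubectl label nodes 127.0.0.1 test=label
    kubectl drain 127.0.0.1 --selector=foo=bar  # rejected: node name and --selector are mutually exclusive
    kubectl cordon                              # rejected: 'USAGE: cordon NODE [flags]'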
... skipping 15 lines ...
I0712 08:49:36.664] Successful
I0712 08:49:36.665] message:The following compatible plugins are available:
I0712 08:49:36.665] 
I0712 08:49:36.665] test/fixtures/pkg/kubectl/plugins/version/kubectl-version
I0712 08:49:36.665]   - warning: kubectl-version overwrites existing command: "kubectl version"
I0712 08:49:36.665] 
I0712 08:49:36.665] error: one plugin warning was found
I0712 08:49:36.666] has:kubectl-version overwrites existing command: "kubectl version"
I0712 08:49:36.735] Successful
I0712 08:49:36.736] message:The following compatible plugins are available:
I0712 08:49:36.736] 
I0712 08:49:36.736] test/fixtures/pkg/kubectl/plugins/kubectl-foo
I0712 08:49:36.736] test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo
I0712 08:49:36.736]   - warning: test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin: test/fixtures/pkg/kubectl/plugins/kubectl-foo
I0712 08:49:36.736] 
I0712 08:49:36.736] error: one plugin warning was found
I0712 08:49:36.737] has:test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin
I0712 08:49:36.813] Successful
I0712 08:49:36.814] message:The following compatible plugins are available:
I0712 08:49:36.814] 
I0712 08:49:36.814] test/fixtures/pkg/kubectl/plugins/kubectl-foo
I0712 08:49:36.814] has:plugins are available
I0712 08:49:36.897] Successful
I0712 08:49:36.897] message:Unable to read directory "test/fixtures/pkg/kubectl/plugins/empty" from your PATH: open test/fixtures/pkg/kubectl/plugins/empty: no such file or directory. Skipping...
I0712 08:49:36.897] error: unable to find any kubectl plugins in your PATH
I0712 08:49:36.897] has:unable to find any kubectl plugins in your PATH
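The plugin checks rely on kubectl's discovery convention: any executable on PATH whose name starts with "kubectl-" becomes a subcommand, and kubectl plugin list warns when a plugin overwrites a builtin (kubectl version above) or is overshadowed by a same-named plugin earlier in PATH. A minimal sketch, assuming a scratch directory:

    mkdir -p ~/bin && printf '#!/bin/sh\necho "I am plugin foo"\n' > ~/bin/kubectl-foo
    chmod +x ~/bin/kubectl-foo
    export PATH=~/bin:$PATH
    kubectl plugin list    # discovery plus the warnings shown in the log
    kubectl foo            # runs the plugin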
I0712 08:49:36.977] Successful
I0712 08:49:36.978] message:I am plugin foo
I0712 08:49:36.978] has:plugin foo
I0712 08:49:37.051] Successful
I0712 08:49:37.051] message:I am plugin bar called with args test/fixtures/pkg/kubectl/plugins/bar/kubectl-bar arg1
... skipping 12 lines ...
I0712 08:49:37.208] 
I0712 08:49:37.211] +++ Running case: test-cmd.run_impersonation_tests 
I0712 08:49:37.215] +++ working dir: /go/src/k8s.io/kubernetes
I0712 08:49:37.217] +++ command: run_impersonation_tests
I0712 08:49:37.227] +++ [0712 08:49:37] Testing impersonation
I0712 08:49:37.302] Successful
I0712 08:49:37.302] message:error: requesting groups or user-extra for  without impersonating a user
I0712 08:49:37.302] has:without impersonating a user
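That error is kubectl refusing --as-group (or --as-user-extra) when no --as user is given. The assertions a few lines further down check a CSR created as user1 with three groups, so the successful invocation presumably looks like this; the manifest name is illustrative and only the user comes from the log:

    kubectl create -f csr.yaml --as-group=group1                       # fails: no impersonated user
    kubectl create -f csr.yaml --as=user1 --as-group=group1 --as-group=group2
    kubectl get csr foo -o jsonpath='{.spec.username}'                 # user1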
W0712 08:49:37.403] E0712 08:49:33.910631   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:37.404] E0712 08:49:33.984251   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:37.404] E0712 08:49:34.071201   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:37.404] E0712 08:49:34.170961   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:37.405] E0712 08:49:34.912241   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:37.405] E0712 08:49:34.985711   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:37.405] E0712 08:49:35.072938   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:37.405] E0712 08:49:35.172934   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:37.406] E0712 08:49:35.913497   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:37.406] E0712 08:49:35.987424   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:37.406] E0712 08:49:36.074389   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:37.407] E0712 08:49:36.174717   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:37.407] E0712 08:49:36.915092   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:37.408] E0712 08:49:36.988983   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:37.408] E0712 08:49:37.075966   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:37.408] E0712 08:49:37.176055   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:49:37.510] certificatesigningrequest.certificates.k8s.io/foo created
I0712 08:49:37.612] authorization.sh:68: Successful get csr/foo {{.spec.username}}: user1
I0712 08:49:37.704] authorization.sh:69: Successful get csr/foo {{range .spec.groups}}{{.}}{{end}}: system:authenticated
I0712 08:49:37.780] certificatesigningrequest.certificates.k8s.io "foo" deleted
W0712 08:49:37.917] E0712 08:49:37.916643   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:37.991] E0712 08:49:37.990567   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:38.078] E0712 08:49:38.077639   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:38.178] E0712 08:49:38.177936   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0712 08:49:38.279] certificatesigningrequest.certificates.k8s.io/foo created
I0712 08:49:38.280] authorization.sh:74: Successful get csr/foo {{len .spec.groups}}: 3
I0712 08:49:38.280] authorization.sh:75: Successful get csr/foo {{range .spec.groups}}{{.}} {{end}}: group2 group1 ,,,chameleon 
I0712 08:49:38.280] certificatesigningrequest.certificates.k8s.io "foo" deleted
I0712 08:49:38.280] +++ exit code: 0
I0712 08:49:38.280] Recording: run_wait_tests
... skipping 21 lines ...
I0712 08:49:40.839] has:test-2 condition met
I0712 08:49:40.853] +++ exit code: 0
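run_wait_tests creates the test-1 and test-2 deployments (the ScalingReplicaSet/SuccessfulCreate events below) and then blocks until a condition is met on them; "condition met" is the phrasing kubectl wait prints. The exact invocation is hidden by the skipped lines, but it is presumably something like this, with the condition and timeout assumed:

    kubectl wait --for=condition=Available deployment/test-1 deployment/test-2 --timeout=60s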
W0712 08:49:40.954] I0712 08:49:38.555923   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921378-15279", Name:"test-1", UID:"2e653d50-ed16-4d8b-8648-723e23a9e490", APIVersion:"apps/v1", ResourceVersion:"3027", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-1-78b87bbcf7 to 1
W0712 08:49:40.955] I0712 08:49:38.561984   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921378-15279", Name:"test-1-78b87bbcf7", UID:"4f632104-33a1-4206-a6e9-9a99642840af", APIVersion:"apps/v1", ResourceVersion:"3028", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-1-78b87bbcf7-wl7zg
W0712 08:49:40.955] I0712 08:49:38.639233   51895 event.go:255] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1562921378-15279", Name:"test-2", UID:"ca921398-4847-4e11-ba85-aec0cfed52f1", APIVersion:"apps/v1", ResourceVersion:"3037", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-2-7dd97dbdbb to 1
W0712 08:49:40.956] I0712 08:49:38.644083   51895 event.go:255] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1562921378-15279", Name:"test-2-7dd97dbdbb", UID:"87c537c3-eb3c-4997-8e17-0477c3dd56d2", APIVersion:"apps/v1", ResourceVersion:"3038", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-2-7dd97dbdbb-jknjp
W0712 08:49:40.956] E0712 08:49:38.918125   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:40.956] E0712 08:49:38.992213   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:40.957] E0712 08:49:39.079177   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:40.957] E0712 08:49:39.179536   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:40.957] E0712 08:49:39.919571   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:40.958] E0712 08:49:39.993931   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:40.958] E0712 08:49:40.080576   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:40.958] E0712 08:49:40.181128   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:40.958] E0712 08:49:40.921213   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:40.964] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0712 08:49:40.996] E0712 08:49:40.995565   51895 reflector.go:125] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0712 08:49:41.043] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0712 08:49:41.061] I0712 08:49:41.061240   48560 controller.go:176] Shutting down kubernetes service endpoint reconciler
W0712 08:49:41.067] W0712 08:49:41.066799   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.068] W0712 08:49:41.066829   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.069] W0712 08:49:41.066842   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.070] W0712 08:49:41.066849   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.070] I0712 08:49:41.061313   48560 secure_serving.go:160] Stopped listening on 127.0.0.1:8080
W0712 08:49:41.070] I0712 08:49:41.061391   48560 crdregistration_controller.go:143] Shutting down crd-autoregister controller
W0712 08:49:41.071] W0712 08:49:41.066928   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.072] W0712 08:49:41.066806   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.072] W0712 08:49:41.066929   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.073] I0712 08:49:41.061406   48560 apiservice_controller.go:106] Shutting down APIServiceRegistrationController
W0712 08:49:41.073] I0712 08:49:41.061479   48560 available_controller.go:388] Shutting down AvailableConditionController
W0712 08:49:41.073] I0712 08:49:41.061496   48560 controller.go:87] Shutting down OpenAPI AggregationController
W0712 08:49:41.074] I0712 08:49:41.061499   48560 crd_finalizer.go:267] Shutting down CRDFinalizer
W0712 08:49:41.074] I0712 08:49:41.061564   48560 controller.go:120] Shutting down OpenAPI controller
W0712 08:49:41.075] I0712 08:49:41.061580   48560 establishing_controller.go:84] Shutting down EstablishingController
... skipping 33 lines ...
W0712 08:49:41.087] I0712 08:49:41.064804   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.087] I0712 08:49:41.064806   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.087] I0712 08:49:41.064819   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.088] I0712 08:49:41.064935   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.088] I0712 08:49:41.064956   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.088] I0712 08:49:41.064954   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.089] W0712 08:49:41.064984   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.089] W0712 08:49:41.064984   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.089] I0712 08:49:41.065006   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.090] W0712 08:49:41.065014   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.090] W0712 08:49:41.065025   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.091] W0712 08:49:41.065038   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.091] W0712 08:49:41.065058   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.091] W0712 08:49:41.065067   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.092] W0712 08:49:41.065070   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.092] W0712 08:49:41.065091   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.093] W0712 08:49:41.065102   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.094] W0712 08:49:41.065106   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.094] W0712 08:49:41.065109   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.094] W0712 08:49:41.065133   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.095] W0712 08:49:41.065141   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.095] W0712 08:49:41.065174   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.096] W0712 08:49:41.065176   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.096] W0712 08:49:41.065207   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.097] W0712 08:49:41.065214   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.097] W0712 08:49:41.065242   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.097] W0712 08:49:41.065240   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.098] W0712 08:49:41.065252   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.098] W0712 08:49:41.065275   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.099] W0712 08:49:41.065280   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.099] W0712 08:49:41.065293   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.100] W0712 08:49:41.065306   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.100] W0712 08:49:41.065323   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.100] W0712 08:49:41.065341   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.101] I0712 08:49:41.065379   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.101] W0712 08:49:41.065388   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.102] W0712 08:49:41.065471   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.102] I0712 08:49:41.061435   48560 autoregister_controller.go:164] Shutting down autoregister controller
W0712 08:49:41.102] W0712 08:49:41.065564   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.103] W0712 08:49:41.065588   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.103] I0712 08:49:41.065737   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.103] I0712 08:49:41.065761   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.104] I0712 08:49:41.065915   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.104] I0712 08:49:41.065944   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.104] I0712 08:49:41.065967   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.105] I0712 08:49:41.065989   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.105] I0712 08:49:41.065993   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.106] I0712 08:49:41.066026   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.106] W0712 08:49:41.066045   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.107] I0712 08:49:41.066088   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.107] I0712 08:49:41.066113   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.107] I0712 08:49:41.066138   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.108] I0712 08:49:41.066179   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.108] I0712 08:49:41.066213   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.108] I0712 08:49:41.066236   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
... skipping 7 lines ...
W0712 08:49:41.110] I0712 08:49:41.066538   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.110] I0712 08:49:41.066566   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.110] I0712 08:49:41.066576   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.110] I0712 08:49:41.066606   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.110] I0712 08:49:41.066624   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.110] I0712 08:49:41.066635   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.111] W0712 08:49:41.066717   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.111] W0712 08:49:41.066736   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.111] W0712 08:49:41.066732   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.111] W0712 08:49:41.066742   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.112] I0712 08:49:41.066749   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.112] W0712 08:49:41.066757   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.112] W0712 08:49:41.066777   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.112] W0712 08:49:41.066789   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.113] W0712 08:49:41.066807   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.113] W0712 08:49:41.066807   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.114] W0712 08:49:41.066856   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.114] W0712 08:49:41.066869   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.114] W0712 08:49:41.066871   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.115] W0712 08:49:41.066930   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.115] W0712 08:49:41.066940   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.115] W0712 08:49:41.066935   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.116] W0712 08:49:41.066969   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.116] W0712 08:49:41.066972   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.116] I0712 08:49:41.061392   48560 nonstructuralschema_controller.go:203] Shutting down NonStructuralSchemaConditionController
W0712 08:49:41.117] W0712 08:49:41.066995   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.117] W0712 08:49:41.066991   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.117] W0712 08:49:41.067013   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.117] W0712 08:49:41.067026   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.117] W0712 08:49:41.067045   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.118] W0712 08:49:41.067069   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.118] W0712 08:49:41.067082   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:41.118] I0712 08:49:41.067945   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.118] I0712 08:49:41.067964   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.118] I0712 08:49:41.068015   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.119] I0712 08:49:41.068017   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.119] I0712 08:49:41.068051   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:41.119] I0712 08:49:41.068101   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
... skipping 55 lines ...
W0712 08:49:41.127] + make test-integration
I0712 08:49:41.228] No resources found
I0712 08:49:41.228] No resources found
I0712 08:49:41.228] +++ [0712 08:49:41] TESTS PASSED
I0712 08:49:41.228] junit report dir: /workspace/artifacts
I0712 08:49:41.228] +++ [0712 08:49:41] Clean up complete
W0712 08:49:42.064] W0712 08:49:42.063362   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.064] W0712 08:49:42.063549   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.065] W0712 08:49:42.063649   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.065] W0712 08:49:42.063719   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.065] W0712 08:49:42.063731   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.066] W0712 08:49:42.064061   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.066] W0712 08:49:42.064072   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.067] W0712 08:49:42.064085   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.067] W0712 08:49:42.064100   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.067] W0712 08:49:42.064105   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.068] W0712 08:49:42.064097   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.068] W0712 08:49:42.064126   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.069] W0712 08:49:42.064134   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.069] W0712 08:49:42.064148   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.069] W0712 08:49:42.064224   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.070] W0712 08:49:42.064260   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.070] W0712 08:49:42.064479   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.071] W0712 08:49:42.065146   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.071] W0712 08:49:42.064491   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.071] W0712 08:49:42.064539   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.072] W0712 08:49:42.064745   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.072] W0712 08:49:42.064792   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.073] W0712 08:49:42.064814   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.073] W0712 08:49:42.064832   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.074] W0712 08:49:42.064859   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.074] W0712 08:49:42.064929   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.074] W0712 08:49:42.064976   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.075] W0712 08:49:42.065021   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.075] W0712 08:49:42.065063   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.076] W0712 08:49:42.065075   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.076] W0712 08:49:42.065082   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.076] W0712 08:49:42.065088   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.077] W0712 08:49:42.065120   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.077] W0712 08:49:42.065126   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.077] W0712 08:49:42.065382   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.078] W0712 08:49:42.065783   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.078] W0712 08:49:42.065842   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.078] W0712 08:49:42.065866   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.078] W0712 08:49:42.065923   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.079] W0712 08:49:42.066040   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.079] W0712 08:49:42.066109   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.079] W0712 08:49:42.066170   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.079] W0712 08:49:42.066222   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.080] W0712 08:49:42.066256   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.080] W0712 08:49:42.066255   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.080] W0712 08:49:42.066271   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.080] W0712 08:49:42.066357   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.081] W0712 08:49:42.066414   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.081] W0712 08:49:42.066475   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.081] W0712 08:49:42.066477   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.081] W0712 08:49:42.066718   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.082] W0712 08:49:42.066766   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.082] W0712 08:49:42.066816   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.082] W0712 08:49:42.066852   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.082] W0712 08:49:42.066855   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.083] W0712 08:49:42.066901   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.083] W0712 08:49:42.066932   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.083] W0712 08:49:42.066957   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.083] W0712 08:49:42.067026   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.084] W0712 08:49:42.067195   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.084] W0712 08:49:42.067195   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.084] W0712 08:49:42.067421   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:42.084] W0712 08:49:42.067668   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.351] W0712 08:49:43.350558   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.352] W0712 08:49:43.350847   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.356] W0712 08:49:43.355541   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.361] W0712 08:49:43.360982   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.388] W0712 08:49:43.388142   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.391] W0712 08:49:43.391234   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.398] W0712 08:49:43.398125   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.410] W0712 08:49:43.410133   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.416] W0712 08:49:43.416341   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.457] W0712 08:49:43.456590   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.466] W0712 08:49:43.465955   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.467] W0712 08:49:43.467427   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.521] W0712 08:49:43.520420   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.522] W0712 08:49:43.520637   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.524] W0712 08:49:43.523949   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.533] W0712 08:49:43.533157   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.534] W0712 08:49:43.533735   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.563] W0712 08:49:43.562404   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.578] W0712 08:49:43.577753   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.578] W0712 08:49:43.577834   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.609] W0712 08:49:43.608776   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.614] W0712 08:49:43.614124   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.629] W0712 08:49:43.629278   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.633] W0712 08:49:43.633282   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.650] W0712 08:49:43.650195   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.659] W0712 08:49:43.659206   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.662] W0712 08:49:43.661779   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.686] W0712 08:49:43.685310   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.708] W0712 08:49:43.708236   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.711] W0712 08:49:43.711481   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.728] W0712 08:49:43.727587   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.734] W0712 08:49:43.733690   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.734] W0712 08:49:43.734188   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.735] W0712 08:49:43.735134   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.745] W0712 08:49:43.745287   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.754] W0712 08:49:43.753677   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.759] W0712 08:49:43.758375   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.771] W0712 08:49:43.770460   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.771] W0712 08:49:43.770871   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.795] W0712 08:49:43.794228   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.812] W0712 08:49:43.811814   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.818] W0712 08:49:43.817699   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.826] W0712 08:49:43.826165   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.829] W0712 08:49:43.828851   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.841] W0712 08:49:43.840751   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.850] W0712 08:49:43.849398   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.854] W0712 08:49:43.853741   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.861] W0712 08:49:43.861068   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.879] W0712 08:49:43.878713   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.892] W0712 08:49:43.891385   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.894] W0712 08:49:43.893965   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.898] W0712 08:49:43.898202   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.902] W0712 08:49:43.902088   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.912] W0712 08:49:43.912309   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.921] W0712 08:49:43.920398   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.933] W0712 08:49:43.932864   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.937] W0712 08:49:43.936589   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.949] W0712 08:49:43.948828   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.956] W0712 08:49:43.956165   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.960] W0712 08:49:43.959388   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.973] W0712 08:49:43.973112   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.976] W0712 08:49:43.976246   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:43.977] W0712 08:49:43.977381   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:45.518] W0712 08:49:45.517843   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:45.536] W0712 08:49:45.536196   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:45.597] W0712 08:49:45.597223   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:45.674] W0712 08:49:45.673363   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:45.688] W0712 08:49:45.687329   48560 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0712 08:49:45.749] I0712 08:49:45.748383   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:45.754] I0712 08:49:45.753596   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:45.795] I0712 08:49:45.794452   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:45.892] I0712 08:49:45.891514   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:45.897] I0712 08:49:45.897054   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0712 08:49:45.949] I0712 08:49:45.949280   48560 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
... skipping 50 lines ...
I0712 09:03:35.471] ok  	k8s.io/kubernetes/test/integration/serving	48.525s
I0712 09:03:35.471] ok  	k8s.io/kubernetes/test/integration/statefulset	11.300s
I0712 09:03:35.471] ok  	k8s.io/kubernetes/test/integration/storageclasses	3.659s
I0712 09:03:35.471] ok  	k8s.io/kubernetes/test/integration/tls	7.920s
I0712 09:03:35.471] ok  	k8s.io/kubernetes/test/integration/ttlcontroller	9.819s
I0712 09:03:35.471] ok  	k8s.io/kubernetes/test/integration/volume	93.023s
I0712 09:03:35.471] FAIL	k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/test/integration	210.221s
I0712 09:03:35.471] ok  	k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/test/integration/conversion	14.157s
I0712 09:03:47.052] +++ [0712 09:03:47] Saved JUnit XML test report to /workspace/artifacts/junit_b08e264f3d2ff14dff3b873d155e207833096397_20190712-084950.xml
I0712 09:03:47.055] Makefile:185: recipe for target 'test' failed
I0712 09:03:47.066] +++ [0712 09:03:47] Cleaning up etcd
W0712 09:03:47.167] make[1]: *** [test] Error 1
W0712 09:03:47.168] !!! [0712 09:03:47] Call tree:
W0712 09:03:47.168] !!! [0712 09:03:47]  1: hack/make-rules/test-integration.sh:89 runTests(...)
I0712 09:03:47.393] +++ [0712 09:03:47] Integration test cleanup complete
I0712 09:03:47.395] Makefile:204: recipe for target 'test-integration' failed
W0712 09:03:47.496] make: *** [test-integration] Error 1
W0712 09:03:48.758] Traceback (most recent call last):
W0712 09:03:48.759]   File "/workspace/./test-infra/jenkins/../scenarios/kubernetes_verify.py", line 178, in <module>
W0712 09:03:48.759]     ARGS.exclude_typecheck, ARGS.exclude_godep)
W0712 09:03:48.759]   File "/workspace/./test-infra/jenkins/../scenarios/kubernetes_verify.py", line 140, in main
W0712 09:03:48.759]     check(*cmd)
W0712 09:03:48.759]   File "/workspace/./test-infra/jenkins/../scenarios/kubernetes_verify.py", line 48, in check
W0712 09:03:48.760]     subprocess.check_call(cmd)
W0712 09:03:48.760]   File "/usr/lib/python2.7/subprocess.py", line 186, in check_call
W0712 09:03:48.768]     raise CalledProcessError(retcode, cmd)
W0712 09:03:48.768] subprocess.CalledProcessError: Command '('docker', 'run', '--rm=true', '--privileged=true', '-v', '/var/run/docker.sock:/var/run/docker.sock', '-v', '/etc/localtime:/etc/localtime:ro', '-v', '/workspace/k8s.io/kubernetes:/go/src/k8s.io/kubernetes', '-v', '/workspace/k8s.io/:/workspace/k8s.io/', '-v', '/workspace/_artifacts:/workspace/artifacts', '-e', 'KUBE_FORCE_VERIFY_CHECKS=n', '-e', 'KUBE_VERIFY_GIT_BRANCH=master', '-e', 'EXCLUDE_TYPECHECK=n', '-e', 'EXCLUDE_GODEP=n', '-e', 'REPO_DIR=/workspace/k8s.io/kubernetes', '--tmpfs', '/tmp:exec,mode=1777', 'gcr.io/k8s-testimages/kubekins-test:1.14-v20190318-2ac98e338', 'bash', '-c', 'cd kubernetes && ./hack/jenkins/test-dockerized.sh')' returned non-zero exit status 2
E0712 09:03:48.774] Command failed
I0712 09:03:48.774] process 655 exited with code 1 after 26.9m
E0712 09:03:48.775] FAIL: pull-kubernetes-integration
I0712 09:03:48.775] Call:  gcloud auth activate-service-account --key-file=/etc/service-account/service-account.json
W0712 09:03:49.284] Activated service account credentials for: [pr-kubekins@kubernetes-jenkins-pull.iam.gserviceaccount.com]
I0712 09:03:49.343] process 112634 exited with code 0 after 0.0m
I0712 09:03:49.344] Call:  gcloud config get-value account
I0712 09:03:49.637] process 112646 exited with code 0 after 0.0m
I0712 09:03:49.638] Will upload results to gs://kubernetes-jenkins/pr-logs using pr-kubekins@kubernetes-jenkins-pull.iam.gserviceaccount.com
I0712 09:03:49.638] Upload result and artifacts...
I0712 09:03:49.638] Gubernator results at https://gubernator.k8s.io/build/kubernetes-jenkins/pr-logs/pull/78447/pull-kubernetes-integration/1149598207756472322
I0712 09:03:49.639] Call:  gsutil ls gs://kubernetes-jenkins/pr-logs/pull/78447/pull-kubernetes-integration/1149598207756472322/artifacts
W0712 09:03:50.726] CommandException: One or more URLs matched no objects.
E0712 09:03:50.856] Command failed
I0712 09:03:50.856] process 112658 exited with code 1 after 0.0m
W0712 09:03:50.856] Remote dir gs://kubernetes-jenkins/pr-logs/pull/78447/pull-kubernetes-integration/1149598207756472322/artifacts not exist yet
I0712 09:03:50.856] Call:  gsutil -m -q -o GSUtil:use_magicfile=True cp -r -c -z log,txt,xml /workspace/_artifacts gs://kubernetes-jenkins/pr-logs/pull/78447/pull-kubernetes-integration/1149598207756472322/artifacts
I0712 09:03:54.670] process 112800 exited with code 0 after 0.1m
W0712 09:03:54.670] metadata path /workspace/_artifacts/metadata.json does not exist
W0712 09:03:54.670] metadata not found or invalid, init with empty metadata
... skipping 23 lines ...