PR: spiffxp: cleanup container probe tests and add tcp probe case
Result: FAILURE
Tests: 1 failed / 1398 succeeded
Started: 2019-05-15 22:20
Elapsed: 29m58s
Revision:
Builder: gke-prow-containerd-pool-99179761-jh56
Refs: master:aaec77a9, 70658:41770276
pod: 837bcb62-775f-11e9-8ee0-0a580a6c0dad
infra-commit: 0f0e3e066
repo: k8s.io/kubernetes
repo-commit: 6936c2e3ebb85f930b4f974050c2e47abe7f30c7
repos: {"k8s.io/kubernetes": "master:aaec77a94b67878ca1bdd884f2778f4388d203f2,70658:41770276b775d9d6e3420a048f3da850cde2d834"}

Test Failures


k8s.io/kubernetes/test/integration/auth [build failed] 0.00s

from junit_d431ed5f68ae4ddf888439fb96b687a923412204_20190515-223558.xml

1398 passed tests (collapsed)

4 skipped tests (collapsed)

Error lines from build-log.txt

... skipping 314 lines ...
W0515 22:29:55.709] I0515 22:29:55.708841   47798 serving.go:312] Generated self-signed cert (/tmp/apiserver.crt, /tmp/apiserver.key)
W0515 22:29:55.710] I0515 22:29:55.708929   47798 server.go:558] external host was not specified, using 172.17.0.2
W0515 22:29:55.710] W0515 22:29:55.708953   47798 authentication.go:415] AnonymousAuth is not allowed with the AlwaysAllow authorizer. Resetting AnonymousAuth to false. You should use a different authorizer
W0515 22:29:55.710] I0515 22:29:55.709428   47798 server.go:145] Version: v1.16.0-alpha.0.62+6936c2e3ebb85f
W0515 22:29:56.449] I0515 22:29:56.448566   47798 plugins.go:158] Loaded 4 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority.
W0515 22:29:56.449] I0515 22:29:56.448598   47798 plugins.go:161] Loaded 4 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ResourceQuota.
W0515 22:29:56.449] E0515 22:29:56.449008   47798 prometheus.go:55] failed to register depth metric admission_quota_controller: duplicate metrics collector registration attempted
W0515 22:29:56.450] E0515 22:29:56.449052   47798 prometheus.go:68] failed to register adds metric admission_quota_controller: duplicate metrics collector registration attempted
W0515 22:29:56.450] E0515 22:29:56.449074   47798 prometheus.go:82] failed to register latency metric admission_quota_controller: duplicate metrics collector registration attempted
W0515 22:29:56.450] E0515 22:29:56.449088   47798 prometheus.go:96] failed to register workDuration metric admission_quota_controller: duplicate metrics collector registration attempted
W0515 22:29:56.450] E0515 22:29:56.449127   47798 prometheus.go:112] failed to register unfinished metric admission_quota_controller: duplicate metrics collector registration attempted
W0515 22:29:56.450] E0515 22:29:56.449151   47798 prometheus.go:126] failed to register unfinished metric admission_quota_controller: duplicate metrics collector registration attempted
W0515 22:29:56.451] E0515 22:29:56.449171   47798 prometheus.go:152] failed to register depth metric admission_quota_controller: duplicate metrics collector registration attempted
W0515 22:29:56.451] E0515 22:29:56.449189   47798 prometheus.go:164] failed to register adds metric admission_quota_controller: duplicate metrics collector registration attempted
W0515 22:29:56.451] E0515 22:29:56.449267   47798 prometheus.go:176] failed to register latency metric admission_quota_controller: duplicate metrics collector registration attempted
W0515 22:29:56.451] E0515 22:29:56.449313   47798 prometheus.go:188] failed to register work_duration metric admission_quota_controller: duplicate metrics collector registration attempted
W0515 22:29:56.451] E0515 22:29:56.449340   47798 prometheus.go:203] failed to register unfinished_work_seconds metric admission_quota_controller: duplicate metrics collector registration attempted
W0515 22:29:56.452] E0515 22:29:56.449363   47798 prometheus.go:216] failed to register longest_running_processor_microseconds metric admission_quota_controller: duplicate metrics collector registration attempted
W0515 22:29:56.452] I0515 22:29:56.449376   47798 plugins.go:158] Loaded 4 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority.
W0515 22:29:56.452] I0515 22:29:56.449381   47798 plugins.go:161] Loaded 4 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ResourceQuota.
W0515 22:29:56.452] I0515 22:29:56.450969   47798 client.go:354] parsed scheme: ""
W0515 22:29:56.452] I0515 22:29:56.450991   47798 client.go:354] scheme "" not registered, fallback to default scheme
W0515 22:29:56.452] I0515 22:29:56.451097   47798 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W0515 22:29:56.453] I0515 22:29:56.451171   47798 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
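The burst of "duplicate metrics collector registration attempted" errors above is the admission plugins being initialized a second time against a process-global metrics registry: the second registration attempt is rejected with an error rather than crashing the apiserver, which is why the same lines repeat harmlessly later in the log. A minimal stdlib sketch of that behavior, assuming only the general pattern (the `registry` type below is illustrative, not the actual Prometheus client code):

```go
package main

import (
	"errors"
	"fmt"
)

// errDuplicate mirrors the message seen in the log above.
var errDuplicate = errors.New("duplicate metrics collector registration attempted")

// registry is a toy stand-in for a process-global metrics registry:
// registering the same collector name twice returns an error instead
// of panicking, so a second initialization pass only produces log noise.
type registry struct {
	collectors map[string]bool
}

func (r *registry) register(name string) error {
	if r.collectors[name] {
		return errDuplicate
	}
	r.collectors[name] = true
	return nil
}

func main() {
	r := &registry{collectors: map[string]bool{}}
	fmt.Println(r.register("admission_quota_controller")) // first pass: <nil>
	fmt.Println(r.register("admission_quota_controller")) // second pass: the duplicate error
}
```

The design choice matters here: because registration failure is a returned error and not a panic, re-running plugin initialization (as this test fixture does) degrades to repeated E-level log lines instead of a crashed process.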
... skipping 361 lines ...
W0515 22:29:57.005] W0515 22:29:57.004979   47798 genericapiserver.go:347] Skipping API storage.k8s.io/v1alpha1 because it has no resources.
W0515 22:29:57.447] I0515 22:29:57.447107   47798 client.go:354] parsed scheme: ""
W0515 22:29:57.448] I0515 22:29:57.447152   47798 client.go:354] scheme "" not registered, fallback to default scheme
W0515 22:29:57.448] I0515 22:29:57.447210   47798 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W0515 22:29:57.448] I0515 22:29:57.447389   47798 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0515 22:29:57.448] I0515 22:29:57.448038   47798 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0515 22:29:57.920] E0515 22:29:57.919684   47798 prometheus.go:55] failed to register depth metric admission_quota_controller: duplicate metrics collector registration attempted
W0515 22:29:57.921] E0515 22:29:57.919803   47798 prometheus.go:68] failed to register adds metric admission_quota_controller: duplicate metrics collector registration attempted
W0515 22:29:57.921] E0515 22:29:57.919824   47798 prometheus.go:82] failed to register latency metric admission_quota_controller: duplicate metrics collector registration attempted
W0515 22:29:57.921] E0515 22:29:57.919838   47798 prometheus.go:96] failed to register workDuration metric admission_quota_controller: duplicate metrics collector registration attempted
W0515 22:29:57.921] E0515 22:29:57.919869   47798 prometheus.go:112] failed to register unfinished metric admission_quota_controller: duplicate metrics collector registration attempted
W0515 22:29:57.922] E0515 22:29:57.919905   47798 prometheus.go:126] failed to register unfinished metric admission_quota_controller: duplicate metrics collector registration attempted
W0515 22:29:57.922] E0515 22:29:57.919992   47798 prometheus.go:152] failed to register depth metric admission_quota_controller: duplicate metrics collector registration attempted
W0515 22:29:57.922] E0515 22:29:57.920005   47798 prometheus.go:164] failed to register adds metric admission_quota_controller: duplicate metrics collector registration attempted
W0515 22:29:57.923] E0515 22:29:57.920059   47798 prometheus.go:176] failed to register latency metric admission_quota_controller: duplicate metrics collector registration attempted
W0515 22:29:57.923] E0515 22:29:57.920116   47798 prometheus.go:188] failed to register work_duration metric admission_quota_controller: duplicate metrics collector registration attempted
W0515 22:29:57.923] E0515 22:29:57.920161   47798 prometheus.go:203] failed to register unfinished_work_seconds metric admission_quota_controller: duplicate metrics collector registration attempted
W0515 22:29:57.924] E0515 22:29:57.920176   47798 prometheus.go:216] failed to register longest_running_processor_microseconds metric admission_quota_controller: duplicate metrics collector registration attempted
W0515 22:29:57.924] I0515 22:29:57.920197   47798 plugins.go:158] Loaded 4 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority.
W0515 22:29:57.924] I0515 22:29:57.920208   47798 plugins.go:161] Loaded 4 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ResourceQuota.
W0515 22:29:57.924] I0515 22:29:57.921507   47798 client.go:354] parsed scheme: ""
W0515 22:29:57.925] I0515 22:29:57.921535   47798 client.go:354] scheme "" not registered, fallback to default scheme
W0515 22:29:57.925] I0515 22:29:57.921572   47798 asm_amd64.s:1337] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W0515 22:29:57.925] I0515 22:29:57.921611   47798 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
... skipping 107 lines ...
W0515 22:30:39.565] I0515 22:30:39.560406   51154 controller_utils.go:1029] Waiting for caches to sync for resource quota controller
W0515 22:30:39.565] I0515 22:30:39.560440   51154 resource_quota_monitor.go:303] QuotaMonitor running
W0515 22:30:39.565] I0515 22:30:39.561293   51154 controllermanager.go:523] Started "deployment"
W0515 22:30:39.565] I0515 22:30:39.561556   51154 deployment_controller.go:152] Starting deployment controller
W0515 22:30:39.565] I0515 22:30:39.561593   51154 controller_utils.go:1029] Waiting for caches to sync for deployment controller
W0515 22:30:39.565] I0515 22:30:39.561765   51154 node_lifecycle_controller.go:77] Sending events to api server
W0515 22:30:39.566] E0515 22:30:39.561822   51154 core.go:160] failed to start cloud node lifecycle controller: no cloud provider provided
W0515 22:30:39.566] W0515 22:30:39.561837   51154 controllermanager.go:515] Skipping "cloud-node-lifecycle"
W0515 22:30:39.566] I0515 22:30:39.562595   51154 controllermanager.go:523] Started "replicationcontroller"
W0515 22:30:39.566] I0515 22:30:39.562728   51154 replica_set.go:182] Starting replicationcontroller controller
W0515 22:30:39.566] I0515 22:30:39.562776   51154 controller_utils.go:1029] Waiting for caches to sync for ReplicationController controller
W0515 22:30:39.566] I0515 22:30:39.563549   51154 controllermanager.go:523] Started "replicaset"
W0515 22:30:39.567] I0515 22:30:39.563745   51154 replica_set.go:182] Starting replicaset controller
... skipping 38 lines ...
W0515 22:30:39.999] I0515 22:30:39.999107   51154 controller_utils.go:1029] Waiting for caches to sync for disruption controller
W0515 22:30:40.000] I0515 22:30:39.999872   51154 controllermanager.go:523] Started "csrcleaner"
W0515 22:30:40.000] I0515 22:30:39.999967   51154 cleaner.go:81] Starting CSR cleaner controller
W0515 22:30:40.000] I0515 22:30:40.000419   51154 controllermanager.go:523] Started "ttl"
W0515 22:30:40.000] I0515 22:30:40.000760   51154 ttl_controller.go:116] Starting TTL controller
W0515 22:30:40.001] I0515 22:30:40.000790   51154 controller_utils.go:1029] Waiting for caches to sync for TTL controller
W0515 22:30:40.002] E0515 22:30:40.002127   51154 core.go:76] Failed to start service controller: WARNING: no cloud provider provided, services of type LoadBalancer will fail
W0515 22:30:40.002] W0515 22:30:40.002200   51154 controllermanager.go:515] Skipping "service"
W0515 22:30:40.003] I0515 22:30:40.002211   51154 core.go:170] Will not configure cloud provider routes for allocate-node-cidrs: false, configure-cloud-routes: true.
W0515 22:30:40.003] W0515 22:30:40.002217   51154 controllermanager.go:515] Skipping "route"
W0515 22:30:40.003] W0515 22:30:40.002840   51154 probe.go:268] Flexvolume plugin directory at /usr/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
W0515 22:30:40.005] I0515 22:30:40.004657   51154 controllermanager.go:523] Started "attachdetach"
W0515 22:30:40.005] I0515 22:30:40.004821   51154 attach_detach_controller.go:335] Starting attach detach controller
W0515 22:30:40.005] I0515 22:30:40.004834   51154 controller_utils.go:1029] Waiting for caches to sync for attach detach controller
W0515 22:30:40.005] I0515 22:30:40.005364   51154 controllermanager.go:523] Started "serviceaccount"
W0515 22:30:40.005] I0515 22:30:40.005440   51154 serviceaccounts_controller.go:115] Starting service account controller
W0515 22:30:40.006] I0515 22:30:40.005485   51154 controller_utils.go:1029] Waiting for caches to sync for service account controller
W0515 22:30:40.007] I0515 22:30:40.006614   51154 controllermanager.go:523] Started "job"
W0515 22:30:40.007] I0515 22:30:40.006649   51154 job_controller.go:143] Starting job controller
W0515 22:30:40.007] I0515 22:30:40.006704   51154 controller_utils.go:1029] Waiting for caches to sync for job controller
W0515 22:30:40.019] W0515 22:30:40.018914   51154 actual_state_of_world.go:503] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="127.0.0.1" does not exist
W0515 22:30:40.044] I0515 22:30:40.043931   51154 controller_utils.go:1036] Caches are synced for PV protection controller
W0515 22:30:40.049] I0515 22:30:40.049308   51154 controller_utils.go:1036] Caches are synced for namespace controller
W0515 22:30:40.063] I0515 22:30:40.062737   51154 controller_utils.go:1036] Caches are synced for deployment controller
W0515 22:30:40.067] I0515 22:30:40.066608   51154 controller_utils.go:1036] Caches are synced for ReplicaSet controller
W0515 22:30:40.068] I0515 22:30:40.066620   51154 controller_utils.go:1036] Caches are synced for ReplicationController controller
W0515 22:30:40.068] I0515 22:30:40.066726   51154 controller_utils.go:1036] Caches are synced for certificate controller
... skipping 27 lines ...
I0515 22:30:40.365]   "goVersion": "go1.12.1",
I0515 22:30:40.365]   "compiler": "gc",
I0515 22:30:40.365]   "platform": "linux/amd64"
I0515 22:30:40.552] }+++ [0515 22:30:40] Testing kubectl version: check client only output matches expected output
W0515 22:30:40.653] I0515 22:30:40.376621   51154 controller_utils.go:1036] Caches are synced for persistent volume controller
W0515 22:30:40.653] I0515 22:30:40.542455   51154 controller_utils.go:1036] Caches are synced for ClusterRoleAggregator controller
W0515 22:30:40.653] E0515 22:30:40.555525   51154 clusterroleaggregation_controller.go:180] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again
W0515 22:30:40.654] E0515 22:30:40.559751   51154 clusterroleaggregation_controller.go:180] admin failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "admin": the object has been modified; please apply your changes to the latest version and try again
W0515 22:30:40.654] E0515 22:30:40.573362   51154 clusterroleaggregation_controller.go:180] admin failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "admin": the object has been modified; please apply your changes to the latest version and try again
W0515 22:30:40.680] I0515 22:30:40.679614   51154 controller_utils.go:1036] Caches are synced for HPA controller
W0515 22:30:40.685] I0515 22:30:40.684986   51154 controller_utils.go:1036] Caches are synced for endpoint controller
I0515 22:30:40.786] Successful: the flag '--client' shows correct client info
I0515 22:30:40.786] (BSuccessful: the flag '--client' correctly has no server version info
I0515 22:30:40.786] (B+++ [0515 22:30:40] Testing kubectl version: verify json output
W0515 22:30:40.887] I0515 22:30:40.786979   51154 controller_utils.go:1036] Caches are synced for daemon sets controller
... skipping 67 lines ...
I0515 22:30:44.192] +++ working dir: /go/src/k8s.io/kubernetes
I0515 22:30:44.195] +++ command: run_RESTMapper_evaluation_tests
I0515 22:30:44.207] +++ [0515 22:30:44] Creating namespace namespace-1557959444-13298
I0515 22:30:44.287] namespace/namespace-1557959444-13298 created
I0515 22:30:44.368] Context "test" modified.
I0515 22:30:44.376] +++ [0515 22:30:44] Testing RESTMapper
I0515 22:30:44.488] +++ [0515 22:30:44] "kubectl get unknownresourcetype" returns error as expected: error: the server doesn't have a resource type "unknownresourcetype"
I0515 22:30:44.505] +++ exit code: 0
I0515 22:30:44.644] NAME                              SHORTNAMES   APIGROUP                       NAMESPACED   KIND
I0515 22:30:44.645] bindings                                                                      true         Binding
I0515 22:30:44.645] componentstatuses                 cs                                          false        ComponentStatus
I0515 22:30:44.645] configmaps                        cm                                          true         ConfigMap
I0515 22:30:44.645] endpoints                         ep                                          true         Endpoints
... skipping 661 lines ...
I0515 22:31:05.301] (Bpoddisruptionbudget.policy/test-pdb-3 created
I0515 22:31:05.399] core.sh:251: Successful get pdb/test-pdb-3 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 2
I0515 22:31:05.478] (Bpoddisruptionbudget.policy/test-pdb-4 created
I0515 22:31:05.576] core.sh:255: Successful get pdb/test-pdb-4 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 50%
I0515 22:31:05.746] (Bcore.sh:261: Successful get pods --namespace=test-kubectl-describe-pod {{range.items}}{{.metadata.name}}:{{end}}: 
I0515 22:31:05.981] (Bpod/env-test-pod created
W0515 22:31:06.081] error: resource(s) were provided, but no name, label selector, or --all flag specified
W0515 22:31:06.082] error: setting 'all' parameter but found a non empty selector. 
W0515 22:31:06.083] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0515 22:31:06.083] I0515 22:31:04.942065   47798 controller.go:606] quota admission added evaluator for: poddisruptionbudgets.policy
W0515 22:31:06.083] error: min-available and max-unavailable cannot be both specified
I0515 22:31:06.187] core.sh:264: Successful describe pods --namespace=test-kubectl-describe-pod env-test-pod:
I0515 22:31:06.187] Name:         env-test-pod
I0515 22:31:06.187] Namespace:    test-kubectl-describe-pod
I0515 22:31:06.188] Priority:     0
I0515 22:31:06.188] Node:         <none>
I0515 22:31:06.188] Labels:       <none>
... skipping 143 lines ...
I0515 22:31:18.483] (Bservice "modified" deleted
I0515 22:31:18.566] replicationcontroller "modified" deleted
I0515 22:31:18.875] core.sh:434: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0515 22:31:19.060] (Bpod/valid-pod created
I0515 22:31:19.174] core.sh:438: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0515 22:31:19.340] (BSuccessful
I0515 22:31:19.341] message:Error from server: cannot restore map from string
I0515 22:31:19.341] has:cannot restore map from string
I0515 22:31:19.435] Successful
I0515 22:31:19.435] message:pod/valid-pod patched (no change)
I0515 22:31:19.435] has:patched (no change)
I0515 22:31:19.526] pod/valid-pod patched
I0515 22:31:19.627] core.sh:455: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
... skipping 4 lines ...
I0515 22:31:20.084] core.sh:465: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
I0515 22:31:20.169] (Bpod/valid-pod patched
I0515 22:31:20.268] core.sh:470: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: changed-with-yaml:
I0515 22:31:20.349] (Bpod/valid-pod patched
I0515 22:31:20.452] core.sh:475: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:3.1:
I0515 22:31:20.621] (Bpod/valid-pod patched
W0515 22:31:20.721] E0515 22:31:19.330553   47798 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"cannot restore map from string"}
I0515 22:31:20.822] core.sh:491: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
I0515 22:31:20.926] (B+++ [0515 22:31:20] "kubectl patch with resourceVersion 499" returns error as expected: Error from server (Conflict): Operation cannot be fulfilled on pods "valid-pod": the object has been modified; please apply your changes to the latest version and try again
I0515 22:31:21.202] pod "valid-pod" deleted
I0515 22:31:21.212] pod/valid-pod replaced
I0515 22:31:21.324] core.sh:515: Successful get pod valid-pod {{(index .spec.containers 0).name}}: replaced-k8s-serve-hostname
I0515 22:31:21.523] (BSuccessful
I0515 22:31:21.523] message:error: --grace-period must have --force specified
I0515 22:31:21.523] has:\-\-grace-period must have \-\-force specified
I0515 22:31:21.723] Successful
I0515 22:31:21.723] message:error: --timeout must have --force specified
I0515 22:31:21.724] has:\-\-timeout must have \-\-force specified
I0515 22:31:21.917] node/node-v1-test created
W0515 22:31:22.018] W0515 22:31:21.918504   51154 actual_state_of_world.go:503] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="node-v1-test" does not exist
I0515 22:31:22.119] node/node-v1-test replaced
I0515 22:31:22.226] core.sh:552: Successful get node node-v1-test {{.metadata.annotations.a}}: b
I0515 22:31:22.305] (Bnode "node-v1-test" deleted
I0515 22:31:22.412] core.sh:559: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
I0515 22:31:22.749] (Bcore.sh:562: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/serve_hostname:
I0515 22:31:24.452] (Bcore.sh:575: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
... skipping 26 lines ...
I0515 22:31:25.579] (B+++ [0515 22:31:25] Creating namespace namespace-1557959485-31952
I0515 22:31:25.653] namespace/namespace-1557959485-31952 created
I0515 22:31:25.728] Context "test" modified.
I0515 22:31:25.829] core.sh:610: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0515 22:31:26.015] (Bpod/redis-master created
I0515 22:31:26.019] pod/valid-pod created
W0515 22:31:26.120] error: 'name' already has a value (valid-pod), and --overwrite is false
W0515 22:31:26.120] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
I0515 22:31:26.221] core.sh:614: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: redis-master:valid-pod:
I0515 22:31:26.235] (Bcore.sh:618: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: redis-master:valid-pod:
I0515 22:31:26.315] (Bpod "redis-master" deleted
I0515 22:31:26.321] pod "valid-pod" deleted
I0515 22:31:26.431] core.sh:622: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 72 lines ...
I0515 22:31:33.119] +++ Running case: test-cmd.run_kubectl_create_error_tests 
I0515 22:31:33.121] +++ working dir: /go/src/k8s.io/kubernetes
I0515 22:31:33.124] +++ command: run_kubectl_create_error_tests
I0515 22:31:33.137] +++ [0515 22:31:33] Creating namespace namespace-1557959493-28980
I0515 22:31:33.212] namespace/namespace-1557959493-28980 created
I0515 22:31:33.292] Context "test" modified.
I0515 22:31:33.301] +++ [0515 22:31:33] Testing kubectl create with error
W0515 22:31:33.402] Error: must specify one of -f and -k
W0515 22:31:33.402] 
W0515 22:31:33.402] Create a resource from a file or from stdin.
W0515 22:31:33.402] 
W0515 22:31:33.403]  JSON and YAML formats are accepted.
W0515 22:31:33.403] 
W0515 22:31:33.403] Examples:
... skipping 41 lines ...
W0515 22:31:33.413] 
W0515 22:31:33.414] Usage:
W0515 22:31:33.414]   kubectl create -f FILENAME [options]
W0515 22:31:33.414] 
W0515 22:31:33.414] Use "kubectl <command> --help" for more information about a given command.
W0515 22:31:33.415] Use "kubectl options" for a list of global command-line options (applies to all commands).
I0515 22:31:33.594] +++ [0515 22:31:33] "kubectl create with empty string list returns error as expected: error: error validating "hack/testdata/invalid-rc-with-empty-args.yaml": error validating data: ValidationError(ReplicationController.spec.template.spec.containers[0].args): unknown object type "nil" in ReplicationController.spec.template.spec.containers[0].args[0]; if you choose to ignore these errors, turn validation off with --validate=false
W0515 22:31:33.697] kubectl convert is DEPRECATED and will be removed in a future version.
W0515 22:31:33.697] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
I0515 22:31:33.798] +++ exit code: 0
I0515 22:31:33.815] Recording: run_kubectl_apply_tests
I0515 22:31:33.815] Running command: run_kubectl_apply_tests
I0515 22:31:33.838] 
... skipping 34 lines ...
I0515 22:31:36.553] +++ [0515 22:31:36] Creating namespace namespace-1557959496-6586
I0515 22:31:36.632] namespace/namespace-1557959496-6586 created
I0515 22:31:36.712] Context "test" modified.
I0515 22:31:36.722] +++ [0515 22:31:36] Testing kubectl run
I0515 22:31:36.818] run.sh:29: Successful get jobs {{range.items}}{{.metadata.name}}:{{end}}: 
I0515 22:31:36.913] (Bjob.batch/pi created
W0515 22:31:37.013] Error from server (NotFound): resources.mygroup.example.com "myobj" not found
W0515 22:31:37.014] kubectl run --generator=job/v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
W0515 22:31:37.014] I0515 22:31:36.903185   47798 controller.go:606] quota admission added evaluator for: jobs.batch
W0515 22:31:37.014] I0515 22:31:36.919778   51154 event.go:258] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1557959496-6586", Name:"pi", UID:"49b36635-71c1-4c07-8c7d-08e607783e39", APIVersion:"batch/v1", ResourceVersion:"509", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: pi-t225k
I0515 22:31:37.115] run.sh:33: Successful get jobs {{range.items}}{{.metadata.name}}:{{end}}: pi:
I0515 22:31:37.168] (BSuccessful describe pods:
I0515 22:31:37.169] Name:           pi-t225k
... skipping 81 lines ...
W0515 22:31:39.473] I0515 22:31:37.886957   51154 event.go:258] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1557959496-6586", Name:"nginx-apps", UID:"1ae8bd84-5131-4321-bb35-6c675a69b1e9", APIVersion:"apps/v1", ResourceVersion:"531", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-apps-8c9bdf9bd to 1
W0515 22:31:39.473] I0515 22:31:37.891061   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959496-6586", Name:"nginx-apps-8c9bdf9bd", UID:"3ba8e1c7-f346-4274-93bb-8411be8ce129", APIVersion:"apps/v1", ResourceVersion:"532", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-apps-8c9bdf9bd-brmj2
W0515 22:31:39.473] kubectl run --generator=cronjob/v1beta1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
W0515 22:31:39.474] I0515 22:31:38.343428   47798 controller.go:606] quota admission added evaluator for: cronjobs.batch
I0515 22:31:39.574] create.sh:34: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
I0515 22:31:39.582] (BSuccessful
I0515 22:31:39.582] message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
I0515 22:31:39.582] has:pods "selector-test-pod-dont-apply" not found
I0515 22:31:39.663] pod "selector-test-pod" deleted
I0515 22:31:39.685] +++ exit code: 0
I0515 22:31:39.732] Recording: run_kubectl_apply_deployments_tests
I0515 22:31:39.733] Running command: run_kubectl_apply_deployments_tests
I0515 22:31:39.756] 
... skipping 38 lines ...
W0515 22:31:42.488] I0515 22:31:42.393282   51154 event.go:258] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1557959499-19925", Name:"nginx", UID:"97949741-9472-4b93-ad78-e1b53863d5f2", APIVersion:"apps/v1", ResourceVersion:"598", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-8c9ccf86d to 3
W0515 22:31:42.489] I0515 22:31:42.397933   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959499-19925", Name:"nginx-8c9ccf86d", UID:"8f298f41-18f0-491d-9715-d54696f27f83", APIVersion:"apps/v1", ResourceVersion:"599", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8c9ccf86d-9bldp
W0515 22:31:42.489] I0515 22:31:42.402333   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959499-19925", Name:"nginx-8c9ccf86d", UID:"8f298f41-18f0-491d-9715-d54696f27f83", APIVersion:"apps/v1", ResourceVersion:"599", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8c9ccf86d-fpgnh
W0515 22:31:42.489] I0515 22:31:42.403516   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959499-19925", Name:"nginx-8c9ccf86d", UID:"8f298f41-18f0-491d-9715-d54696f27f83", APIVersion:"apps/v1", ResourceVersion:"599", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8c9ccf86d-qbh25
I0515 22:31:42.590] apps.sh:147: Successful get deployment nginx {{.metadata.name}}: nginx
I0515 22:31:46.853] (BSuccessful
I0515 22:31:46.853] message:Error from server (Conflict): error when applying patch:
I0515 22:31:46.854] {"metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"extensions/v1beta1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1557959499-19925\",\"resourceVersion\":\"99\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx2\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx2\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"},"resourceVersion":"99"},"spec":{"selector":{"matchLabels":{"name":"nginx2"}},"template":{"metadata":{"labels":{"name":"nginx2"}}}}}
I0515 22:31:46.854] to:
I0515 22:31:46.854] Resource: "extensions/v1beta1, Resource=deployments", GroupVersionKind: "extensions/v1beta1, Kind=Deployment"
I0515 22:31:46.854] Name: "nginx", Namespace: "namespace-1557959499-19925"
I0515 22:31:46.856] Object: &{map["apiVersion":"extensions/v1beta1" "kind":"Deployment" "metadata":map["annotations":map["deployment.kubernetes.io/revision":"1" "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"extensions/v1beta1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1557959499-19925\"},\"spec\":{\"replicas\":3,\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx1\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"] "creationTimestamp":"2019-05-15T22:31:42Z" "generation":'\x01' "labels":map["name":"nginx"] "managedFields":[map["apiVersion":"apps/v1" "fields":map["f:metadata":map["f:annotations":map["f:deployment.kubernetes.io/revision":map[]]] "f:status":map["f:conditions":map[".":map[] "k:{\"type\":\"Available\"}":map[".":map[] "f:lastTransitionTime":map[] "f:lastUpdateTime":map[] "f:message":map[] "f:reason":map[] "f:status":map[] "f:type":map[]]] "f:observedGeneration":map[] "f:replicas":map[] "f:unavailableReplicas":map[] "f:updatedReplicas":map[]]] "manager":"kube-controller-manager" "operation":"Update" "time":"2019-05-15T22:31:42Z"] map["apiVersion":"extensions/v1beta1" "fields":map["f:metadata":map["f:annotations":map[".":map[] "f:kubectl.kubernetes.io/last-applied-configuration":map[]] "f:labels":map[".":map[] "f:name":map[]]] "f:spec":map["f:progressDeadlineSeconds":map[] "f:replicas":map[] "f:revisionHistoryLimit":map[] "f:selector":map[".":map[] "f:matchLabels":map[".":map[] "f:name":map[]]] "f:strategy":map["f:rollingUpdate":map[".":map[] "f:maxSurge":map[] "f:maxUnavailable":map[]] "f:type":map[]] "f:template":map["f:metadata":map["f:labels":map[".":map[] "f:name":map[]]] "f:spec":map["f:containers":map["k:{\"name\":\"nginx\"}":map[".":map[] "f:image":map[] "f:imagePullPolicy":map[] "f:name":map[] "f:ports":map[".":map[] 
"k:{\"containerPort\":80,\"protocol\":\"TCP\"}":map[".":map[] "f:containerPort":map[] "f:protocol":map[]]] "f:resources":map[] "f:terminationMessagePath":map[] "f:terminationMessagePolicy":map[]]] "f:dnsPolicy":map[] "f:restartPolicy":map[] "f:schedulerName":map[] "f:securityContext":map[] "f:terminationGracePeriodSeconds":map[]]]]] "manager":"kubectl" "operation":"Update" "time":"2019-05-15T22:31:42Z"]] "name":"nginx" "namespace":"namespace-1557959499-19925" "resourceVersion":"611" "selfLink":"/apis/extensions/v1beta1/namespaces/namespace-1557959499-19925/deployments/nginx" "uid":"97949741-9472-4b93-ad78-e1b53863d5f2"] "spec":map["progressDeadlineSeconds":%!q(int64=+2147483647) "replicas":'\x03' "revisionHistoryLimit":%!q(int64=+2147483647) "selector":map["matchLabels":map["name":"nginx1"]] "strategy":map["rollingUpdate":map["maxSurge":'\x01' "maxUnavailable":'\x01'] "type":"RollingUpdate"] "template":map["metadata":map["creationTimestamp":<nil> "labels":map["name":"nginx1"]] "spec":map["containers":[map["image":"k8s.gcr.io/nginx:test-cmd" "imagePullPolicy":"IfNotPresent" "name":"nginx" "ports":[map["containerPort":'P' "protocol":"TCP"]] "resources":map[] "terminationMessagePath":"/dev/termination-log" "terminationMessagePolicy":"File"]] "dnsPolicy":"ClusterFirst" "restartPolicy":"Always" "schedulerName":"default-scheduler" "securityContext":map[] "terminationGracePeriodSeconds":'\x1e']]] "status":map["conditions":[map["lastTransitionTime":"2019-05-15T22:31:42Z" "lastUpdateTime":"2019-05-15T22:31:42Z" "message":"Deployment does not have minimum availability." "reason":"MinimumReplicasUnavailable" "status":"False" "type":"Available"]] "observedGeneration":'\x01' "replicas":'\x03' "unavailableReplicas":'\x03' "updatedReplicas":'\x03']]}
I0515 22:31:46.856] for: "hack/testdata/deployment-label-change2.yaml": Operation cannot be fulfilled on deployments.extensions "nginx": the object has been modified; please apply your changes to the latest version and try again
I0515 22:31:46.856] has:Error from server (Conflict)
W0515 22:31:47.500] I0515 22:31:47.500153   51154 horizontal.go:320] Horizontal Pod Autoscaler frontend has been deleted in namespace-1557959490-26822
I0515 22:31:52.187] deployment.extensions/nginx configured
W0515 22:31:52.288] I0515 22:31:52.193857   51154 event.go:258] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1557959499-19925", Name:"nginx", UID:"498f7b89-c84a-4b65-8448-2847c261bcaf", APIVersion:"apps/v1", ResourceVersion:"635", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-86bb9b4d9f to 3
W0515 22:31:52.288] I0515 22:31:52.198511   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959499-19925", Name:"nginx-86bb9b4d9f", UID:"b0c685e8-f8f5-42d3-ab56-686324998ec7", APIVersion:"apps/v1", ResourceVersion:"636", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-86bb9b4d9f-2dbsh
W0515 22:31:52.288] I0515 22:31:52.203153   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959499-19925", Name:"nginx-86bb9b4d9f", UID:"b0c685e8-f8f5-42d3-ab56-686324998ec7", APIVersion:"apps/v1", ResourceVersion:"636", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-86bb9b4d9f-txlfp
W0515 22:31:52.289] I0515 22:31:52.204285   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959499-19925", Name:"nginx-86bb9b4d9f", UID:"b0c685e8-f8f5-42d3-ab56-686324998ec7", APIVersion:"apps/v1", ResourceVersion:"636", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-86bb9b4d9f-ptb66
... skipping 169 lines ...
I0515 22:31:59.854] +++ [0515 22:31:59] Creating namespace namespace-1557959519-22372
I0515 22:31:59.930] namespace/namespace-1557959519-22372 created
I0515 22:32:00.004] Context "test" modified.
I0515 22:32:00.015] +++ [0515 22:32:00] Testing kubectl get
I0515 22:32:00.105] get.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0515 22:32:00.192] Successful
I0515 22:32:00.192] message:Error from server (NotFound): pods "abc" not found
I0515 22:32:00.192] has:pods "abc" not found
I0515 22:32:00.284] get.sh:37: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0515 22:32:00.373] Successful
I0515 22:32:00.373] message:Error from server (NotFound): pods "abc" not found
I0515 22:32:00.374] has:pods "abc" not found
I0515 22:32:00.461] get.sh:45: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0515 22:32:00.545] Successful
I0515 22:32:00.545] message:{
I0515 22:32:00.545]     "apiVersion": "v1",
I0515 22:32:00.545]     "items": [],
... skipping 23 lines ...
I0515 22:32:00.902] has not:No resources found
I0515 22:32:00.988] Successful
I0515 22:32:00.988] message:NAME
I0515 22:32:00.989] has not:No resources found
I0515 22:32:01.078] get.sh:73: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0515 22:32:01.176] Successful
I0515 22:32:01.177] message:error: the server doesn't have a resource type "foobar"
I0515 22:32:01.177] has not:No resources found
I0515 22:32:01.261] Successful
I0515 22:32:01.262] message:No resources found.
I0515 22:32:01.262] has:No resources found
I0515 22:32:01.344] Successful
I0515 22:32:01.344] message:
I0515 22:32:01.344] has not:No resources found
I0515 22:32:01.430] Successful
I0515 22:32:01.430] message:No resources found.
I0515 22:32:01.430] has:No resources found
I0515 22:32:01.523] get.sh:93: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0515 22:32:01.611] Successful
I0515 22:32:01.611] message:Error from server (NotFound): pods "abc" not found
I0515 22:32:01.612] has:pods "abc" not found
I0515 22:32:01.613] FAIL!
I0515 22:32:01.614] message:Error from server (NotFound): pods "abc" not found
I0515 22:32:01.614] has not:List
I0515 22:32:01.614] 99 /go/src/k8s.io/kubernetes/test/cmd/../../test/cmd/get.sh
I0515 22:32:01.730] Successful
I0515 22:32:01.730] message:I0515 22:32:01.678718   61846 loader.go:359] Config loaded from file:  /tmp/tmp.JzWUpVQqPC/.kube/config
I0515 22:32:01.730] I0515 22:32:01.680161   61846 round_trippers.go:438] GET http://127.0.0.1:8080/version?timeout=32s 200 OK in 0 milliseconds
I0515 22:32:01.730] I0515 22:32:01.702278   61846 round_trippers.go:438] GET http://127.0.0.1:8080/api/v1/namespaces/default/pods 200 OK in 2 milliseconds
... skipping 888 lines ...
I0515 22:32:07.373] Successful
I0515 22:32:07.374] message:NAME    DATA   AGE
I0515 22:32:07.374] one     0      0s
I0515 22:32:07.374] three   0      0s
I0515 22:32:07.374] two     0      0s
I0515 22:32:07.374] STATUS    REASON          MESSAGE
I0515 22:32:07.375] Failure   InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
I0515 22:32:07.375] has not:watch is only supported on individual resources
I0515 22:32:08.464] Successful
I0515 22:32:08.464] message:STATUS    REASON          MESSAGE
I0515 22:32:08.464] Failure   InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
I0515 22:32:08.464] has not:watch is only supported on individual resources
I0515 22:32:08.470] +++ [0515 22:32:08] Creating namespace namespace-1557959528-26558
I0515 22:32:08.544] namespace/namespace-1557959528-26558 created
I0515 22:32:08.617] Context "test" modified.
I0515 22:32:08.716] get.sh:157: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0515 22:32:08.921] pod/valid-pod created
... skipping 104 lines ...
I0515 22:32:09.039] }
I0515 22:32:09.118] get.sh:162: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0515 22:32:09.396] <no value>Successful
I0515 22:32:09.397] message:valid-pod:
I0515 22:32:09.397] has:valid-pod:
I0515 22:32:09.492] Successful
I0515 22:32:09.492] message:error: error executing jsonpath "{.missing}": Error executing template: missing is not found. Printing more information for debugging the template:
I0515 22:32:09.492] 	template was:
I0515 22:32:09.492] 		{.missing}
I0515 22:32:09.492] 	object given to jsonpath engine was:
I0515 22:32:09.494] 		map[string]interface {}{"apiVersion":"v1", "kind":"Pod", "metadata":map[string]interface {}{"creationTimestamp":"2019-05-15T22:32:08Z", "labels":map[string]interface {}{"name":"valid-pod"}, "managedFields":[]interface {}{map[string]interface {}{"apiVersion":"v1", "fields":map[string]interface {}{"f:metadata":map[string]interface {}{"f:labels":map[string]interface {}{".":map[string]interface {}{}, "f:name":map[string]interface {}{}}}, "f:spec":map[string]interface {}{"f:containers":map[string]interface {}{"k:{\"name\":\"kubernetes-serve-hostname\"}":map[string]interface {}{".":map[string]interface {}{}, "f:image":map[string]interface {}{}, "f:imagePullPolicy":map[string]interface {}{}, "f:name":map[string]interface {}{}, "f:resources":map[string]interface {}{".":map[string]interface {}{}, "f:limits":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}, "f:requests":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}}, "f:terminationMessagePath":map[string]interface {}{}, "f:terminationMessagePolicy":map[string]interface {}{}}}, "f:dnsPolicy":map[string]interface {}{}, "f:enableServiceLinks":map[string]interface {}{}, "f:priority":map[string]interface {}{}, "f:restartPolicy":map[string]interface {}{}, "f:schedulerName":map[string]interface {}{}, "f:securityContext":map[string]interface {}{}, "f:terminationGracePeriodSeconds":map[string]interface {}{}}}, "manager":"kubectl", "operation":"Update", "time":"2019-05-15T22:32:08Z"}}, "name":"valid-pod", "namespace":"namespace-1557959528-26558", "resourceVersion":"711", "selfLink":"/api/v1/namespaces/namespace-1557959528-26558/pods/valid-pod", "uid":"20395ab8-e01f-486e-9402-af92cb3e1413"}, "spec":map[string]interface {}{"containers":[]interface {}{map[string]interface {}{"image":"k8s.gcr.io/serve_hostname", "imagePullPolicy":"Always", 
"name":"kubernetes-serve-hostname", "resources":map[string]interface {}{"limits":map[string]interface {}{"cpu":"1", "memory":"512Mi"}, "requests":map[string]interface {}{"cpu":"1", "memory":"512Mi"}}, "terminationMessagePath":"/dev/termination-log", "terminationMessagePolicy":"File"}}, "dnsPolicy":"ClusterFirst", "enableServiceLinks":true, "priority":0, "restartPolicy":"Always", "schedulerName":"default-scheduler", "securityContext":map[string]interface {}{}, "terminationGracePeriodSeconds":30}, "status":map[string]interface {}{"phase":"Pending", "qosClass":"Guaranteed"}}
I0515 22:32:09.494] has:missing is not found
I0515 22:32:09.580] Successful
I0515 22:32:09.580] message:Error executing template: template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing". Printing more information for debugging the template:
I0515 22:32:09.581] 	template was:
I0515 22:32:09.581] 		{{.missing}}
I0515 22:32:09.581] 	raw data was:
I0515 22:32:09.582] 		{"apiVersion":"v1","kind":"Pod","metadata":{"creationTimestamp":"2019-05-15T22:32:08Z","labels":{"name":"valid-pod"},"managedFields":[{"apiVersion":"v1","fields":{"f:metadata":{"f:labels":{".":{},"f:name":{}}},"f:spec":{"f:containers":{"k:{\"name\":\"kubernetes-serve-hostname\"}":{".":{},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:resources":{".":{},"f:limits":{".":{},"f:cpu":{},"f:memory":{}},"f:requests":{".":{},"f:cpu":{},"f:memory":{}}},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:enableServiceLinks":{},"f:priority":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}},"manager":"kubectl","operation":"Update","time":"2019-05-15T22:32:08Z"}],"name":"valid-pod","namespace":"namespace-1557959528-26558","resourceVersion":"711","selfLink":"/api/v1/namespaces/namespace-1557959528-26558/pods/valid-pod","uid":"20395ab8-e01f-486e-9402-af92cb3e1413"},"spec":{"containers":[{"image":"k8s.gcr.io/serve_hostname","imagePullPolicy":"Always","name":"kubernetes-serve-hostname","resources":{"limits":{"cpu":"1","memory":"512Mi"},"requests":{"cpu":"1","memory":"512Mi"}},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File"}],"dnsPolicy":"ClusterFirst","enableServiceLinks":true,"priority":0,"restartPolicy":"Always","schedulerName":"default-scheduler","securityContext":{},"terminationGracePeriodSeconds":30},"status":{"phase":"Pending","qosClass":"Guaranteed"}}
I0515 22:32:09.582] 	object given to template engine was:
I0515 22:32:09.583] 		map[apiVersion:v1 kind:Pod metadata:map[creationTimestamp:2019-05-15T22:32:08Z labels:map[name:valid-pod] managedFields:[map[apiVersion:v1 fields:map[f:metadata:map[f:labels:map[.:map[] f:name:map[]]] f:spec:map[f:containers:map[k:{"name":"kubernetes-serve-hostname"}:map[.:map[] f:image:map[] f:imagePullPolicy:map[] f:name:map[] f:resources:map[.:map[] f:limits:map[.:map[] f:cpu:map[] f:memory:map[]] f:requests:map[.:map[] f:cpu:map[] f:memory:map[]]] f:terminationMessagePath:map[] f:terminationMessagePolicy:map[]]] f:dnsPolicy:map[] f:enableServiceLinks:map[] f:priority:map[] f:restartPolicy:map[] f:schedulerName:map[] f:securityContext:map[] f:terminationGracePeriodSeconds:map[]]] manager:kubectl operation:Update time:2019-05-15T22:32:08Z]] name:valid-pod namespace:namespace-1557959528-26558 resourceVersion:711 selfLink:/api/v1/namespaces/namespace-1557959528-26558/pods/valid-pod uid:20395ab8-e01f-486e-9402-af92cb3e1413] spec:map[containers:[map[image:k8s.gcr.io/serve_hostname imagePullPolicy:Always name:kubernetes-serve-hostname resources:map[limits:map[cpu:1 memory:512Mi] requests:map[cpu:1 memory:512Mi]] terminationMessagePath:/dev/termination-log terminationMessagePolicy:File]] dnsPolicy:ClusterFirst enableServiceLinks:true priority:0 restartPolicy:Always schedulerName:default-scheduler securityContext:map[] terminationGracePeriodSeconds:30] status:map[phase:Pending qosClass:Guaranteed]]
I0515 22:32:09.583] has:map has no entry for key "missing"
W0515 22:32:09.684] error: error executing template "{{.missing}}": template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing"
I0515 22:32:10.671] Successful
I0515 22:32:10.671] message:NAME        READY   STATUS    RESTARTS   AGE
I0515 22:32:10.671] valid-pod   0/1     Pending   0          1s
I0515 22:32:10.671] STATUS      REASON          MESSAGE
I0515 22:32:10.677] Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
I0515 22:32:10.678] has:STATUS
I0515 22:32:10.678] Successful
I0515 22:32:10.678] message:NAME        READY   STATUS    RESTARTS   AGE
I0515 22:32:10.678] valid-pod   0/1     Pending   0          1s
I0515 22:32:10.679] STATUS      REASON          MESSAGE
I0515 22:32:10.679] Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
I0515 22:32:10.679] has:valid-pod
I0515 22:32:11.774] Successful
I0515 22:32:11.774] message:pod/valid-pod
I0515 22:32:11.774] has not:STATUS
I0515 22:32:11.776] Successful
I0515 22:32:11.777] message:pod/valid-pod
... skipping 142 lines ...
I0515 22:32:12.880]   terminationGracePeriodSeconds: 30
I0515 22:32:12.880] status:
I0515 22:32:12.880]   phase: Pending
I0515 22:32:12.881]   qosClass: Guaranteed
I0515 22:32:12.881] has:name: valid-pod
I0515 22:32:12.960] Successful
I0515 22:32:12.961] message:Error from server (NotFound): pods "invalid-pod" not found
I0515 22:32:12.961] has:"invalid-pod" not found
I0515 22:32:13.041] pod "valid-pod" deleted
I0515 22:32:13.144] get.sh:200: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0515 22:32:13.348] pod/redis-master created
I0515 22:32:13.352] pod/valid-pod created
I0515 22:32:13.470] Successful
... skipping 283 lines ...
I0515 22:32:19.227] +++ command: run_kubectl_exec_pod_tests
I0515 22:32:19.239] +++ [0515 22:32:19] Creating namespace namespace-1557959539-14561
I0515 22:32:19.314] namespace/namespace-1557959539-14561 created
I0515 22:32:19.386] Context "test" modified.
I0515 22:32:19.395] +++ [0515 22:32:19] Testing kubectl exec POD COMMAND
I0515 22:32:19.481] Successful
I0515 22:32:19.481] message:Error from server (NotFound): pods "abc" not found
I0515 22:32:19.481] has:pods "abc" not found
I0515 22:32:19.681] pod/test-pod created
I0515 22:32:19.822] Successful
I0515 22:32:19.823] message:Error from server (BadRequest): pod test-pod does not have a host assigned
I0515 22:32:19.823] has not:pods "test-pod" not found
I0515 22:32:19.825] Successful
I0515 22:32:19.825] message:Error from server (BadRequest): pod test-pod does not have a host assigned
I0515 22:32:19.825] has not:pod or type/name must be specified
I0515 22:32:19.909] pod "test-pod" deleted
I0515 22:32:19.934] +++ exit code: 0
I0515 22:32:19.975] Recording: run_kubectl_exec_resource_name_tests
I0515 22:32:19.975] Running command: run_kubectl_exec_resource_name_tests
I0515 22:32:19.998] 
... skipping 2 lines ...
I0515 22:32:20.005] +++ command: run_kubectl_exec_resource_name_tests
I0515 22:32:20.020] +++ [0515 22:32:20] Creating namespace namespace-1557959540-31417
I0515 22:32:20.098] namespace/namespace-1557959540-31417 created
I0515 22:32:20.181] Context "test" modified.
I0515 22:32:20.190] +++ [0515 22:32:20] Testing kubectl exec TYPE/NAME COMMAND
I0515 22:32:20.315] Successful
I0515 22:32:20.315] message:error: the server doesn't have a resource type "foo"
I0515 22:32:20.315] has:error:
I0515 22:32:20.410] Successful
I0515 22:32:20.410] message:Error from server (NotFound): deployments.extensions "bar" not found
I0515 22:32:20.410] has:"bar" not found
I0515 22:32:20.607] pod/test-pod created
I0515 22:32:20.832] replicaset.apps/frontend created
W0515 22:32:20.933] I0515 22:32:20.838455   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959540-31417", Name:"frontend", UID:"ed687c04-66a2-4ee3-aec2-3ca5ef942ef2", APIVersion:"apps/v1", ResourceVersion:"827", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-kzr72
W0515 22:32:20.933] I0515 22:32:20.844450   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959540-31417", Name:"frontend", UID:"ed687c04-66a2-4ee3-aec2-3ca5ef942ef2", APIVersion:"apps/v1", ResourceVersion:"827", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-jx2jk
W0515 22:32:20.933] I0515 22:32:20.844677   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959540-31417", Name:"frontend", UID:"ed687c04-66a2-4ee3-aec2-3ca5ef942ef2", APIVersion:"apps/v1", ResourceVersion:"827", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-ngrct
I0515 22:32:21.040] configmap/test-set-env-config created
I0515 22:32:21.145] Successful
I0515 22:32:21.145] message:error: cannot attach to *v1.ConfigMap: selector for *v1.ConfigMap not implemented
I0515 22:32:21.146] has:not implemented
I0515 22:32:21.239] Successful
I0515 22:32:21.239] message:Error from server (BadRequest): pod test-pod does not have a host assigned
I0515 22:32:21.239] has not:not found
I0515 22:32:21.241] Successful
I0515 22:32:21.241] message:Error from server (BadRequest): pod test-pod does not have a host assigned
I0515 22:32:21.241] has not:pod or type/name must be specified
I0515 22:32:21.345] Successful
I0515 22:32:21.345] message:Error from server (BadRequest): pod frontend-jx2jk does not have a host assigned
I0515 22:32:21.345] has not:not found
I0515 22:32:21.348] Successful
I0515 22:32:21.348] message:Error from server (BadRequest): pod frontend-jx2jk does not have a host assigned
I0515 22:32:21.348] has not:pod or type/name must be specified
I0515 22:32:21.425] pod "test-pod" deleted
I0515 22:32:21.511] replicaset.extensions "frontend" deleted
I0515 22:32:21.596] configmap "test-set-env-config" deleted
I0515 22:32:21.619] +++ exit code: 0
I0515 22:32:21.657] Recording: run_create_secret_tests
I0515 22:32:21.658] Running command: run_create_secret_tests
I0515 22:32:21.682] 
I0515 22:32:21.683] +++ Running case: test-cmd.run_create_secret_tests 
I0515 22:32:21.686] +++ working dir: /go/src/k8s.io/kubernetes
I0515 22:32:21.689] +++ command: run_create_secret_tests
I0515 22:32:21.786] Successful
I0515 22:32:21.786] message:Error from server (NotFound): secrets "mysecret" not found
I0515 22:32:21.786] has:secrets "mysecret" not found
I0515 22:32:21.946] Successful
I0515 22:32:21.947] message:Error from server (NotFound): secrets "mysecret" not found
I0515 22:32:21.947] has:secrets "mysecret" not found
I0515 22:32:21.949] Successful
I0515 22:32:21.949] message:user-specified
I0515 22:32:21.949] has:user-specified
I0515 22:32:22.021] Successful
I0515 22:32:22.096] {"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-create-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-create-cm","uid":"006048d2-7508-4e85-ac4a-64d696548dea","resourceVersion":"848","creationTimestamp":"2019-05-15T22:32:22Z"}}
... skipping 164 lines ...
I0515 22:32:25.074] valid-pod   0/1     Pending   0          1s
I0515 22:32:25.074] has:valid-pod
I0515 22:32:26.173] Successful
I0515 22:32:26.173] message:NAME        READY   STATUS    RESTARTS   AGE
I0515 22:32:26.173] valid-pod   0/1     Pending   0          1s
I0515 22:32:26.173] STATUS      REASON          MESSAGE
I0515 22:32:26.173] Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
I0515 22:32:26.174] has:Timeout exceeded while reading body
I0515 22:32:26.261] Successful
I0515 22:32:26.262] message:NAME        READY   STATUS    RESTARTS   AGE
I0515 22:32:26.262] valid-pod   0/1     Pending   0          2s
I0515 22:32:26.262] has:valid-pod
I0515 22:32:26.337] Successful
I0515 22:32:26.337] message:error: Invalid timeout value. Timeout must be a single integer in seconds, or an integer followed by a corresponding time unit (e.g. 1s | 2m | 3h)
I0515 22:32:26.337] has:Invalid timeout value
I0515 22:32:26.419] pod "valid-pod" deleted
I0515 22:32:26.441] +++ exit code: 0
I0515 22:32:26.479] Recording: run_crd_tests
I0515 22:32:26.479] Running command: run_crd_tests
I0515 22:32:26.502] 
... skipping 237 lines ...
I0515 22:32:31.420] foo.company.com/test patched
I0515 22:32:31.526] crd.sh:237: Successful get foos/test {{.patched}}: value1
I0515 22:32:31.621] foo.company.com/test patched
I0515 22:32:31.732] crd.sh:239: Successful get foos/test {{.patched}}: value2
I0515 22:32:31.831] foo.company.com/test patched
I0515 22:32:31.939] crd.sh:241: Successful get foos/test {{.patched}}: <no value>
I0515 22:32:32.116] +++ [0515 22:32:32] "kubectl patch --local" returns error as expected for CustomResource: error: cannot apply strategic merge patch for company.com/v1, Kind=Foo locally, try --type merge
I0515 22:32:32.189] {
I0515 22:32:32.190]     "apiVersion": "company.com/v1",
I0515 22:32:32.190]     "kind": "Foo",
I0515 22:32:32.190]     "metadata": {
I0515 22:32:32.190]         "annotations": {
I0515 22:32:32.191]             "kubernetes.io/change-cause": "kubectl patch foos/test --server=http://127.0.0.1:8080 --match-server-version=true --patch={\"patched\":null} --type=merge --record=true"
... skipping 318 lines ...
I0515 22:32:49.562] namespace/non-native-resources created
I0515 22:32:49.787] bar.company.com/test created
I0515 22:32:49.909] crd.sh:456: Successful get bars {{len .items}}: 1
I0515 22:32:49.990] namespace "non-native-resources" deleted
I0515 22:32:55.239] crd.sh:459: Successful get bars {{len .items}}: 0
I0515 22:32:55.423] customresourcedefinition.apiextensions.k8s.io "foos.company.com" deleted
W0515 22:32:55.525] Error from server (NotFound): namespaces "non-native-resources" not found
I0515 22:32:55.626] customresourcedefinition.apiextensions.k8s.io "bars.company.com" deleted
I0515 22:32:55.648] customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
I0515 22:32:55.766] customresourcedefinition.apiextensions.k8s.io "validfoos.company.com" deleted
I0515 22:32:55.809] +++ exit code: 0
I0515 22:32:55.886] Recording: run_cmd_with_img_tests
I0515 22:32:55.886] Running command: run_cmd_with_img_tests
... skipping 10 lines ...
W0515 22:32:56.204] I0515 22:32:56.200702   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959575-31480", Name:"test1-7b9c75bcb9", UID:"ff83ea8f-6bcf-4689-b2df-00afeff42d41", APIVersion:"apps/v1", ResourceVersion:"1005", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test1-7b9c75bcb9-tglhn
I0515 22:32:56.305] Successful
I0515 22:32:56.305] message:deployment.apps/test1 created
I0515 22:32:56.306] has:deployment.apps/test1 created
I0515 22:32:56.312] deployment.extensions "test1" deleted
I0515 22:32:56.403] Successful
I0515 22:32:56.403] message:error: Invalid image name "InvalidImageName": invalid reference format
I0515 22:32:56.403] has:error: Invalid image name "InvalidImageName": invalid reference format
I0515 22:32:56.416] +++ exit code: 0
I0515 22:32:56.458] +++ [0515 22:32:56] Testing recursive resources
I0515 22:32:56.465] +++ [0515 22:32:56] Creating namespace namespace-1557959576-5778
I0515 22:32:56.540] namespace/namespace-1557959576-5778 created
I0515 22:32:56.621] Context "test" modified.
I0515 22:32:56.734] generic-resources.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0515 22:32:57.055] generic-resources.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0515 22:32:57.057] Successful
I0515 22:32:57.058] message:pod/busybox0 created
I0515 22:32:57.058] pod/busybox1 created
I0515 22:32:57.058] error: error validating "hack/testdata/recursive/pod/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0515 22:32:57.058] has:error validating data: kind not set
I0515 22:32:57.159] generic-resources.sh:211: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0515 22:32:57.353] generic-resources.sh:219: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: busybox:busybox:
I0515 22:32:57.355] Successful
I0515 22:32:57.356] message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0515 22:32:57.356] has:Object 'Kind' is missing
I0515 22:32:57.461] generic-resources.sh:226: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0515 22:32:57.803] generic-resources.sh:230: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
I0515 22:32:57.806] Successful
I0515 22:32:57.806] message:pod/busybox0 replaced
I0515 22:32:57.807] pod/busybox1 replaced
I0515 22:32:57.807] error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0515 22:32:57.807] has:error validating data: kind not set
I0515 22:32:57.909] generic-resources.sh:235: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0515 22:32:58.017] Successful
I0515 22:32:58.017] message:Name:         busybox0
I0515 22:32:58.017] Namespace:    namespace-1557959576-5778
I0515 22:32:58.018] Priority:     0
I0515 22:32:58.018] Node:         <none>
... skipping 153 lines ...
I0515 22:32:58.044] has:Object 'Kind' is missing
I0515 22:32:58.133] generic-resources.sh:245: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0515 22:32:58.335] generic-resources.sh:249: Successful get pods {{range.items}}{{.metadata.annotations.annotatekey}}:{{end}}: annotatevalue:annotatevalue:
I0515 22:32:58.337] Successful
I0515 22:32:58.337] message:pod/busybox0 annotated
I0515 22:32:58.338] pod/busybox1 annotated
I0515 22:32:58.338] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0515 22:32:58.338] has:Object 'Kind' is missing
I0515 22:32:58.439] generic-resources.sh:254: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0515 22:32:58.770] generic-resources.sh:258: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
I0515 22:32:58.772] Successful
I0515 22:32:58.772] message:Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
I0515 22:32:58.773] pod/busybox0 configured
I0515 22:32:58.773] Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
I0515 22:32:58.773] pod/busybox1 configured
I0515 22:32:58.773] error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0515 22:32:58.773] has:error validating data: kind not set
I0515 22:32:58.872] generic-resources.sh:264: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0515 22:32:59.069] deployment.apps/nginx created
W0515 22:32:59.170] I0515 22:32:59.078744   51154 event.go:258] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1557959576-5778", Name:"nginx", UID:"e60ab409-759a-440f-8f12-813ad70be727", APIVersion:"apps/v1", ResourceVersion:"1029", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-958dc566b to 3
W0515 22:32:59.171] I0515 22:32:59.085192   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959576-5778", Name:"nginx-958dc566b", UID:"4ec96d14-4aa1-4db9-9c0e-9ad114cf272a", APIVersion:"apps/v1", ResourceVersion:"1030", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-958dc566b-jlgmb
W0515 22:32:59.171] I0515 22:32:59.091185   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959576-5778", Name:"nginx-958dc566b", UID:"4ec96d14-4aa1-4db9-9c0e-9ad114cf272a", APIVersion:"apps/v1", ResourceVersion:"1030", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-958dc566b-lmcdg
W0515 22:32:59.171] I0515 22:32:59.092373   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959576-5778", Name:"nginx-958dc566b", UID:"4ec96d14-4aa1-4db9-9c0e-9ad114cf272a", APIVersion:"apps/v1", ResourceVersion:"1030", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-958dc566b-k4hb8
... skipping 48 lines ...
W0515 22:32:59.677] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
I0515 22:32:59.778] generic-resources.sh:280: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0515 22:32:59.887] generic-resources.sh:284: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0515 22:32:59.889] Successful
I0515 22:32:59.890] message:kubectl convert is DEPRECATED and will be removed in a future version.
I0515 22:32:59.890] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
I0515 22:32:59.890] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0515 22:32:59.890] has:Object 'Kind' is missing
I0515 22:32:59.997] generic-resources.sh:289: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0515 22:33:00.100] Successful
I0515 22:33:00.100] message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0515 22:33:00.101] has:busybox0:busybox1:
I0515 22:33:00.102] Successful
I0515 22:33:00.103] message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0515 22:33:00.103] has:Object 'Kind' is missing
W0515 22:33:00.203] I0515 22:33:00.123652   51154 namespace_controller.go:171] Namespace has been deleted non-native-resources
I0515 22:33:00.304] generic-resources.sh:298: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0515 22:33:00.325] pod/busybox0 labeled pod/busybox1 labeled error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0515 22:33:00.432] generic-resources.sh:303: Successful get pods {{range.items}}{{.metadata.labels.mylabel}}:{{end}}: myvalue:myvalue:
I0515 22:33:00.434] Successful
I0515 22:33:00.434] message:pod/busybox0 labeled
I0515 22:33:00.435] pod/busybox1 labeled
I0515 22:33:00.435] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0515 22:33:00.435] has:Object 'Kind' is missing
I0515 22:33:00.537] generic-resources.sh:308: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0515 22:33:00.632] pod/busybox0 patched pod/busybox1 patched error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0515 22:33:00.743] generic-resources.sh:313: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: prom/busybox:prom/busybox:
I0515 22:33:00.745] Successful
I0515 22:33:00.745] message:pod/busybox0 patched
I0515 22:33:00.746] pod/busybox1 patched
I0515 22:33:00.746] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0515 22:33:00.746] has:Object 'Kind' is missing
I0515 22:33:00.850] generic-resources.sh:318: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0515 22:33:01.048] generic-resources.sh:322: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0515 22:33:01.050] Successful
I0515 22:33:01.051] message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
I0515 22:33:01.051] pod "busybox0" force deleted
I0515 22:33:01.051] pod "busybox1" force deleted
I0515 22:33:01.051] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0515 22:33:01.052] has:Object 'Kind' is missing
I0515 22:33:01.146] generic-resources.sh:327: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0515 22:33:01.339] replicationcontroller/busybox0 created
I0515 22:33:01.343] replicationcontroller/busybox1 created
W0515 22:33:01.444] error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W0515 22:33:01.445] I0515 22:33:01.343788   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1557959576-5778", Name:"busybox0", UID:"807e662d-161c-44cc-9992-8bd6e5d318a3", APIVersion:"v1", ResourceVersion:"1060", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-vfcmw
W0515 22:33:01.445] I0515 22:33:01.348278   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1557959576-5778", Name:"busybox1", UID:"f0dca415-f6d2-4025-ae23-f63f172a3cb3", APIVersion:"v1", ResourceVersion:"1062", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-28tls
I0515 22:33:01.546] generic-resources.sh:331: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0515 22:33:01.559] generic-resources.sh:336: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0515 22:33:01.660] generic-resources.sh:337: Successful get rc busybox0 {{.spec.replicas}}: 1
I0515 22:33:01.764] generic-resources.sh:338: Successful get rc busybox1 {{.spec.replicas}}: 1
I0515 22:33:01.955] generic-resources.sh:343: Successful get hpa busybox0 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
I0515 22:33:02.051] (Bgeneric-resources.sh:344: Successful get hpa busybox1 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
I0515 22:33:02.053] Successful
I0515 22:33:02.053] message:horizontalpodautoscaler.autoscaling/busybox0 autoscaled
I0515 22:33:02.053] horizontalpodautoscaler.autoscaling/busybox1 autoscaled
I0515 22:33:02.054] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0515 22:33:02.054] has:Object 'Kind' is missing
I0515 22:33:02.133] horizontalpodautoscaler.autoscaling "busybox0" deleted
I0515 22:33:02.220] horizontalpodautoscaler.autoscaling "busybox1" deleted
I0515 22:33:02.323] generic-resources.sh:352: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0515 22:33:02.426] generic-resources.sh:353: Successful get rc busybox0 {{.spec.replicas}}: 1
I0515 22:33:02.523] generic-resources.sh:354: Successful get rc busybox1 {{.spec.replicas}}: 1
I0515 22:33:02.731] generic-resources.sh:358: Successful get service busybox0 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
I0515 22:33:02.828] (Bgeneric-resources.sh:359: Successful get service busybox1 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
I0515 22:33:02.830] Successful
I0515 22:33:02.831] message:service/busybox0 exposed
I0515 22:33:02.831] service/busybox1 exposed
I0515 22:33:02.831] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0515 22:33:02.831] has:Object 'Kind' is missing
I0515 22:33:02.928] generic-resources.sh:365: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0515 22:33:03.024] generic-resources.sh:366: Successful get rc busybox0 {{.spec.replicas}}: 1
I0515 22:33:03.123] generic-resources.sh:367: Successful get rc busybox1 {{.spec.replicas}}: 1
I0515 22:33:03.339] generic-resources.sh:371: Successful get rc busybox0 {{.spec.replicas}}: 2
I0515 22:33:03.436] (Bgeneric-resources.sh:372: Successful get rc busybox1 {{.spec.replicas}}: 2
I0515 22:33:03.439] Successful
I0515 22:33:03.439] message:replicationcontroller/busybox0 scaled
I0515 22:33:03.439] replicationcontroller/busybox1 scaled
I0515 22:33:03.439] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0515 22:33:03.439] has:Object 'Kind' is missing
I0515 22:33:03.538] generic-resources.sh:377: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0515 22:33:03.730] generic-resources.sh:381: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0515 22:33:03.733] Successful
I0515 22:33:03.733] message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
I0515 22:33:03.733] replicationcontroller "busybox0" force deleted
I0515 22:33:03.734] replicationcontroller "busybox1" force deleted
I0515 22:33:03.734] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0515 22:33:03.734] has:Object 'Kind' is missing
I0515 22:33:03.830] generic-resources.sh:386: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0515 22:33:04.034] deployment.apps/nginx1-deployment created
I0515 22:33:04.039] deployment.apps/nginx0-deployment created
W0515 22:33:04.139] I0515 22:33:03.226640   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1557959576-5778", Name:"busybox0", UID:"807e662d-161c-44cc-9992-8bd6e5d318a3", APIVersion:"v1", ResourceVersion:"1082", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-nzqsx
W0515 22:33:04.140] I0515 22:33:03.235377   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1557959576-5778", Name:"busybox1", UID:"f0dca415-f6d2-4025-ae23-f63f172a3cb3", APIVersion:"v1", ResourceVersion:"1085", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-59vkz
W0515 22:33:04.140] error: error validating "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W0515 22:33:04.141] I0515 22:33:04.041016   51154 event.go:258] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1557959576-5778", Name:"nginx1-deployment", UID:"d10e9751-fc0b-4bc1-a315-958506297fcb", APIVersion:"apps/v1", ResourceVersion:"1103", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx1-deployment-67c99bcc6b to 2
W0515 22:33:04.141] I0515 22:33:04.044995   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959576-5778", Name:"nginx1-deployment-67c99bcc6b", UID:"023d7a5b-0dfc-4f54-9b20-6bc1925e8c7c", APIVersion:"apps/v1", ResourceVersion:"1105", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-67c99bcc6b-zltxx
W0515 22:33:04.141] I0515 22:33:04.046150   51154 event.go:258] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1557959576-5778", Name:"nginx0-deployment", UID:"8ff4964d-4cc3-4991-a28a-6262e7db3b05", APIVersion:"apps/v1", ResourceVersion:"1104", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx0-deployment-5886cf98fc to 2
W0515 22:33:04.142] I0515 22:33:04.050326   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959576-5778", Name:"nginx1-deployment-67c99bcc6b", UID:"023d7a5b-0dfc-4f54-9b20-6bc1925e8c7c", APIVersion:"apps/v1", ResourceVersion:"1105", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-67c99bcc6b-dm9pq
W0515 22:33:04.142] I0515 22:33:04.053088   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959576-5778", Name:"nginx0-deployment-5886cf98fc", UID:"a3fcbcc4-c033-4333-8c3c-b035f5794734", APIVersion:"apps/v1", ResourceVersion:"1106", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-5886cf98fc-2wzq9
W0515 22:33:04.142] I0515 22:33:04.058710   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959576-5778", Name:"nginx0-deployment-5886cf98fc", UID:"a3fcbcc4-c033-4333-8c3c-b035f5794734", APIVersion:"apps/v1", ResourceVersion:"1106", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-5886cf98fc-9mcm7
I0515 22:33:04.243] generic-resources.sh:390: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx0-deployment:nginx1-deployment:
I0515 22:33:04.282] generic-resources.sh:391: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
I0515 22:33:04.500] generic-resources.sh:395: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
I0515 22:33:04.503] Successful
I0515 22:33:04.503] message:deployment.apps/nginx1-deployment skipped rollback (current template already matches revision 1)
I0515 22:33:04.504] deployment.apps/nginx0-deployment skipped rollback (current template already matches revision 1)
I0515 22:33:04.504] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0515 22:33:04.504] has:Object 'Kind' is missing
I0515 22:33:04.609] deployment.apps/nginx1-deployment paused
I0515 22:33:04.617] deployment.apps/nginx0-deployment paused
I0515 22:33:04.739] generic-resources.sh:402: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: true:true:
I0515 22:33:04.741] Successful
I0515 22:33:04.741] message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
... skipping 10 lines ...
I0515 22:33:05.100] 1         <none>
I0515 22:33:05.100] 
I0515 22:33:05.101] deployment.apps/nginx0-deployment 
I0515 22:33:05.101] REVISION  CHANGE-CAUSE
I0515 22:33:05.101] 1         <none>
I0515 22:33:05.101] 
I0515 22:33:05.101] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0515 22:33:05.101] has:nginx0-deployment
I0515 22:33:05.103] Successful
I0515 22:33:05.103] message:deployment.apps/nginx1-deployment 
I0515 22:33:05.103] REVISION  CHANGE-CAUSE
I0515 22:33:05.103] 1         <none>
I0515 22:33:05.103] 
I0515 22:33:05.104] deployment.apps/nginx0-deployment 
I0515 22:33:05.104] REVISION  CHANGE-CAUSE
I0515 22:33:05.104] 1         <none>
I0515 22:33:05.104] 
I0515 22:33:05.104] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0515 22:33:05.104] has:nginx1-deployment
I0515 22:33:05.106] Successful
I0515 22:33:05.106] message:deployment.apps/nginx1-deployment 
I0515 22:33:05.106] REVISION  CHANGE-CAUSE
I0515 22:33:05.107] 1         <none>
I0515 22:33:05.107] 
I0515 22:33:05.107] deployment.apps/nginx0-deployment 
I0515 22:33:05.107] REVISION  CHANGE-CAUSE
I0515 22:33:05.107] 1         <none>
I0515 22:33:05.107] 
I0515 22:33:05.108] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0515 22:33:05.108] has:Object 'Kind' is missing
I0515 22:33:05.191] deployment.apps "nginx1-deployment" force deleted
I0515 22:33:05.196] deployment.apps "nginx0-deployment" force deleted
W0515 22:33:05.296] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0515 22:33:05.297] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0515 22:33:06.308] generic-resources.sh:424: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0515 22:33:06.502] replicationcontroller/busybox0 created
I0515 22:33:06.507] replicationcontroller/busybox1 created
W0515 22:33:06.607] error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W0515 22:33:06.608] I0515 22:33:06.507395   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1557959576-5778", Name:"busybox0", UID:"6ee4b8a3-6fe9-43a8-a578-16bc2034cfd8", APIVersion:"v1", ResourceVersion:"1152", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-4srm4
W0515 22:33:06.608] I0515 22:33:06.512543   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1557959576-5778", Name:"busybox1", UID:"9c8a3bb8-24b9-47b2-bfca-32756d5bb0b9", APIVersion:"v1", ResourceVersion:"1154", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-6l4wd
I0515 22:33:06.709] generic-resources.sh:428: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0515 22:33:06.726] Successful
I0515 22:33:06.726] message:no rollbacker has been implemented for "ReplicationController"
I0515 22:33:06.726] no rollbacker has been implemented for "ReplicationController"
... skipping 3 lines ...
I0515 22:33:06.728] message:no rollbacker has been implemented for "ReplicationController"
I0515 22:33:06.728] no rollbacker has been implemented for "ReplicationController"
I0515 22:33:06.729] unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0515 22:33:06.729] has:Object 'Kind' is missing
I0515 22:33:06.836] Successful
I0515 22:33:06.837] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0515 22:33:06.837] error: replicationcontrollers "busybox0" pausing is not supported
I0515 22:33:06.837] error: replicationcontrollers "busybox1" pausing is not supported
I0515 22:33:06.837] has:Object 'Kind' is missing
I0515 22:33:06.839] Successful
I0515 22:33:06.839] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0515 22:33:06.840] error: replicationcontrollers "busybox0" pausing is not supported
I0515 22:33:06.840] error: replicationcontrollers "busybox1" pausing is not supported
I0515 22:33:06.840] has:replicationcontrollers "busybox0" pausing is not supported
I0515 22:33:06.842] Successful
I0515 22:33:06.842] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0515 22:33:06.842] error: replicationcontrollers "busybox0" pausing is not supported
I0515 22:33:06.843] error: replicationcontrollers "busybox1" pausing is not supported
I0515 22:33:06.843] has:replicationcontrollers "busybox1" pausing is not supported
I0515 22:33:06.944] Successful
I0515 22:33:06.945] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0515 22:33:06.945] error: replicationcontrollers "busybox0" resuming is not supported
I0515 22:33:06.945] error: replicationcontrollers "busybox1" resuming is not supported
I0515 22:33:06.945] has:Object 'Kind' is missing
I0515 22:33:06.946] Successful
I0515 22:33:06.947] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0515 22:33:06.947] error: replicationcontrollers "busybox0" resuming is not supported
I0515 22:33:06.947] error: replicationcontrollers "busybox1" resuming is not supported
I0515 22:33:06.948] has:replicationcontrollers "busybox0" resuming is not supported
I0515 22:33:06.949] Successful
I0515 22:33:06.950] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0515 22:33:06.950] error: replicationcontrollers "busybox0" resuming is not supported
I0515 22:33:06.950] error: replicationcontrollers "busybox1" resuming is not supported
I0515 22:33:06.950] has:replicationcontrollers "busybox0" resuming is not supported
I0515 22:33:07.033] replicationcontroller "busybox0" force deleted
I0515 22:33:07.039] replicationcontroller "busybox1" force deleted
W0515 22:33:07.139] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0515 22:33:07.140] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0515 22:33:08.049] Recording: run_namespace_tests
I0515 22:33:08.050] Running command: run_namespace_tests
I0515 22:33:08.073] 
I0515 22:33:08.076] +++ Running case: test-cmd.run_namespace_tests 
I0515 22:33:08.078] +++ working dir: /go/src/k8s.io/kubernetes
I0515 22:33:08.082] +++ command: run_namespace_tests
... skipping 4 lines ...
W0515 22:33:12.570] I0515 22:33:12.569619   51154 controller_utils.go:1029] Waiting for caches to sync for resource quota controller
W0515 22:33:12.670] I0515 22:33:12.669967   51154 controller_utils.go:1036] Caches are synced for resource quota controller
W0515 22:33:13.105] I0515 22:33:13.104689   51154 controller_utils.go:1029] Waiting for caches to sync for garbage collector controller
W0515 22:33:13.205] I0515 22:33:13.205030   51154 controller_utils.go:1036] Caches are synced for garbage collector controller
I0515 22:33:13.463] namespace/my-namespace condition met
I0515 22:33:13.557] Successful
I0515 22:33:13.558] message:Error from server (NotFound): namespaces "my-namespace" not found
I0515 22:33:13.558] has: not found
I0515 22:33:13.630] namespace/my-namespace created
I0515 22:33:13.742] core.sh:1330: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
I0515 22:33:13.945] Successful
I0515 22:33:13.946] message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
I0515 22:33:13.946] namespace "kube-node-lease" deleted
... skipping 30 lines ...
I0515 22:33:13.949] namespace "namespace-1557959543-6882" deleted
I0515 22:33:13.949] namespace "namespace-1557959544-6966" deleted
I0515 22:33:13.949] namespace "namespace-1557959546-22124" deleted
I0515 22:33:13.950] namespace "namespace-1557959547-30050" deleted
I0515 22:33:13.950] namespace "namespace-1557959575-31480" deleted
I0515 22:33:13.950] namespace "namespace-1557959576-5778" deleted
I0515 22:33:13.950] Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
I0515 22:33:13.950] Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
I0515 22:33:13.950] Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
I0515 22:33:13.950] has:warning: deleting cluster-scoped resources
I0515 22:33:13.950] Successful
I0515 22:33:13.951] message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
I0515 22:33:13.951] namespace "kube-node-lease" deleted
I0515 22:33:13.951] namespace "my-namespace" deleted
I0515 22:33:13.951] namespace "namespace-1557959441-12506" deleted
... skipping 28 lines ...
I0515 22:33:13.954] namespace "namespace-1557959543-6882" deleted
I0515 22:33:13.954] namespace "namespace-1557959544-6966" deleted
I0515 22:33:13.954] namespace "namespace-1557959546-22124" deleted
I0515 22:33:13.954] namespace "namespace-1557959547-30050" deleted
I0515 22:33:13.955] namespace "namespace-1557959575-31480" deleted
I0515 22:33:13.955] namespace "namespace-1557959576-5778" deleted
I0515 22:33:13.955] Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
I0515 22:33:13.955] Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
I0515 22:33:13.955] Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
I0515 22:33:13.955] has:namespace "my-namespace" deleted
I0515 22:33:14.063] core.sh:1342: Successful get namespaces {{range.items}}{{ if eq $id_field \"other\" }}found{{end}}{{end}}:: :
I0515 22:33:14.136] namespace/other created
I0515 22:33:14.235] core.sh:1346: Successful get namespaces/other {{.metadata.name}}: other
I0515 22:33:14.327] core.sh:1350: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
I0515 22:33:14.517] pod/valid-pod created
I0515 22:33:14.631] core.sh:1354: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0515 22:33:14.732] core.sh:1356: Successful get pods -n other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0515 22:33:14.829] Successful
I0515 22:33:14.830] message:error: a resource cannot be retrieved by name across all namespaces
I0515 22:33:14.830] has:a resource cannot be retrieved by name across all namespaces
I0515 22:33:14.936] core.sh:1363: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0515 22:33:15.017] pod "valid-pod" force deleted
I0515 22:33:15.119] core.sh:1367: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
I0515 22:33:15.197] namespace "other" deleted
W0515 22:33:15.298] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
... skipping 151 lines ...
I0515 22:33:35.887] +++ command: run_client_config_tests
I0515 22:33:35.900] +++ [0515 22:33:35] Creating namespace namespace-1557959615-32721
I0515 22:33:35.974] namespace/namespace-1557959615-32721 created
I0515 22:33:36.051] Context "test" modified.
I0515 22:33:36.061] +++ [0515 22:33:36] Testing client config
I0515 22:33:36.136] Successful
I0515 22:33:36.137] message:error: stat missing: no such file or directory
I0515 22:33:36.137] has:missing: no such file or directory
I0515 22:33:36.213] Successful
I0515 22:33:36.213] message:error: stat missing: no such file or directory
I0515 22:33:36.213] has:missing: no such file or directory
I0515 22:33:36.290] Successful
I0515 22:33:36.290] message:error: stat missing: no such file or directory
I0515 22:33:36.290] has:missing: no such file or directory
I0515 22:33:36.370] Successful
I0515 22:33:36.371] message:Error in configuration: context was not found for specified context: missing-context
I0515 22:33:36.371] has:context was not found for specified context: missing-context
I0515 22:33:36.450] Successful
I0515 22:33:36.450] message:error: no server found for cluster "missing-cluster"
I0515 22:33:36.451] has:no server found for cluster "missing-cluster"
I0515 22:33:36.528] Successful
I0515 22:33:36.528] message:error: auth info "missing-user" does not exist
I0515 22:33:36.528] has:auth info "missing-user" does not exist
I0515 22:33:36.683] Successful
I0515 22:33:36.683] message:error: Error loading config file "/tmp/newconfig.yaml": no kind "Config" is registered for version "v-1" in scheme "k8s.io/client-go/tools/clientcmd/api/latest/latest.go:50"
I0515 22:33:36.684] has:Error loading config file
I0515 22:33:36.765] Successful
I0515 22:33:36.766] message:error: stat missing-config: no such file or directory
I0515 22:33:36.766] has:no such file or directory
I0515 22:33:36.785] +++ exit code: 0
I0515 22:33:36.829] Recording: run_service_accounts_tests
I0515 22:33:36.830] Running command: run_service_accounts_tests
I0515 22:33:36.855] 
I0515 22:33:36.858] +++ Running case: test-cmd.run_service_accounts_tests 
... skipping 34 lines ...
I0515 22:33:43.732] Labels:                        run=pi
I0515 22:33:43.732] Annotations:                   <none>
I0515 22:33:43.732] Schedule:                      59 23 31 2 *
I0515 22:33:43.733] Concurrency Policy:            Allow
I0515 22:33:43.733] Suspend:                       False
I0515 22:33:43.733] Successful Job History Limit:  3
I0515 22:33:43.733] Failed Job History Limit:      1
I0515 22:33:43.733] Starting Deadline Seconds:     <unset>
I0515 22:33:43.734] Selector:                      <unset>
I0515 22:33:43.734] Parallelism:                   <unset>
I0515 22:33:43.734] Completions:                   <unset>
I0515 22:33:43.734] Pod Template:
I0515 22:33:43.734]   Labels:  run=pi
... skipping 32 lines ...
I0515 22:33:44.298]                 run=pi
I0515 22:33:44.298] Annotations:    cronjob.kubernetes.io/instantiate: manual
I0515 22:33:44.298] Controlled By:  CronJob/pi
I0515 22:33:44.298] Parallelism:    1
I0515 22:33:44.298] Completions:    1
I0515 22:33:44.299] Start Time:     Wed, 15 May 2019 22:33:44 +0000
I0515 22:33:44.299] Pods Statuses:  1 Running / 0 Succeeded / 0 Failed
I0515 22:33:44.299] Pod Template:
I0515 22:33:44.299]   Labels:  controller-uid=51db6bf0-b0c8-4df8-80ca-0cd20ec1b8cb
I0515 22:33:44.299]            job-name=test-job
I0515 22:33:44.299]            run=pi
I0515 22:33:44.299]   Containers:
I0515 22:33:44.299]    pi:
... skipping 389 lines ...
I0515 22:33:54.357]   selector:
I0515 22:33:54.358]     role: padawan
I0515 22:33:54.358]   sessionAffinity: None
I0515 22:33:54.358]   type: ClusterIP
I0515 22:33:54.358] status:
I0515 22:33:54.358]   loadBalancer: {}
W0515 22:33:54.459] error: you must specify resources by --filename when --local is set.
W0515 22:33:54.459] Example resource specifications include:
W0515 22:33:54.459]    '-f rsrc.yaml'
W0515 22:33:54.459]    '--filename=rsrc.json'
I0515 22:33:54.560] core.sh:886: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
I0515 22:33:54.704] core.sh:893: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
I0515 22:33:54.794] service "redis-master" deleted
... skipping 105 lines ...
I0515 22:34:02.208]   Volumes:	<none>
I0515 22:34:02.208]  (dry run)
I0515 22:34:02.308] apps.sh:79: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
I0515 22:34:02.405] apps.sh:80: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0515 22:34:02.505] apps.sh:81: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
I0515 22:34:02.619] daemonset.extensions/bind rolled back
W0515 22:34:02.722] E0515 22:34:02.645270   51154 daemon_controller.go:302] namespace-1557959640-24853/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1557959640-24853", SelfLink:"/apis/apps/v1/namespaces/namespace-1557959640-24853/daemonsets/bind", UID:"14581ac5-45d9-47f8-82a8-e24b540e656e", ResourceVersion:"1635", Generation:3, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63693556441, loc:(*time.Location)(0x72a78a0)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"deprecated.daemonset.template.generation":"3", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true\"},\"labels\":{\"service\":\"bind\"},\"name\":\"bind\",\"namespace\":\"namespace-1557959640-24853\"},\"spec\":{\"selector\":{\"matchLabels\":{\"service\":\"bind\"}},\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n", "kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 
--match-server-version=true"}, OwnerReferences:[]v1.OwnerReference(nil), Initializers:(*v1.Initializers)(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"kube-controller-manager", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc0009f33a0), Fields:(*v1.Fields)(0xc000b5ece0)}, v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc0009f3a80), Fields:(*v1.Fields)(0xc000b5ed38)}, v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001690640), Fields:(*v1.Fields)(0xc000b5edd8)}}}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc001690740), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Initializers:(*v1.Initializers)(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:2.0", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", 
SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc002cb9e08), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc0025dd440), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc001690760), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc000b5ee38)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc002cb9e80)}, Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:2, UpdatedNumberScheduled:0, NumberAvailable:0, NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
I0515 22:34:02.823] apps.sh:84: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I0515 22:34:02.858] apps.sh:85: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0515 22:34:02.984] Successful
I0515 22:34:02.984] message:error: unable to find specified revision 1000000 in history
I0515 22:34:02.984] has:unable to find specified revision
I0515 22:34:03.080] apps.sh:89: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I0515 22:34:03.182] apps.sh:90: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0515 22:34:03.295] daemonset.extensions/bind rolled back
W0515 22:34:03.399] E0515 22:34:03.325088   51154 daemon_controller.go:302] namespace-1557959640-24853/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1557959640-24853", SelfLink:"/apis/apps/v1/namespaces/namespace-1557959640-24853/daemonsets/bind", UID:"14581ac5-45d9-47f8-82a8-e24b540e656e", ResourceVersion:"1639", Generation:4, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63693556441, loc:(*time.Location)(0x72a78a0)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"deprecated.daemonset.template.generation":"4", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true\"},\"labels\":{\"service\":\"bind\"},\"name\":\"bind\",\"namespace\":\"namespace-1557959640-24853\"},\"spec\":{\"selector\":{\"matchLabels\":{\"service\":\"bind\"}},\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n", "kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 
--match-server-version=true"}, OwnerReferences:[]v1.OwnerReference(nil), Initializers:(*v1.Initializers)(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"kube-controller-manager", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001b04cc0), Fields:(*v1.Fields)(0xc001c95258)}, v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001b04e40), Fields:(*v1.Fields)(0xc001c952a0)}, v1.ManagedFieldsEntry{Manager:"kube-controller-manager", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001b05740), Fields:(*v1.Fields)(0xc001c95338)}, v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001b058a0), Fields:(*v1.Fields)(0xc001c95360)}}}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc001b05b80), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Initializers:(*v1.Initializers)(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:latest", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), 
ReadinessProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}, v1.Container{Name:"app", Image:"k8s.gcr.io/nginx:test-cmd", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc0011ad378), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc002487da0), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc001b05be0), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc001c953c0)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc0011ad3f0)}, 
Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:3, UpdatedNumberScheduled:0, NumberAvailable:0, NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
I0515 22:34:03.499] apps.sh:93: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
I0515 22:34:03.508] apps.sh:94: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0515 22:34:03.610] apps.sh:95: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
I0515 22:34:03.697] daemonset.apps "bind" deleted
I0515 22:34:03.723] +++ exit code: 0
I0515 22:34:03.765] Recording: run_rc_tests
... skipping 24 lines ...
I0515 22:34:05.061] Namespace:    namespace-1557959643-10108
I0515 22:34:05.061] Selector:     app=guestbook,tier=frontend
I0515 22:34:05.062] Labels:       app=guestbook
I0515 22:34:05.062]               tier=frontend
I0515 22:34:05.062] Annotations:  <none>
I0515 22:34:05.062] Replicas:     3 current / 3 desired
I0515 22:34:05.062] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0515 22:34:05.062] Pod Template:
I0515 22:34:05.062]   Labels:  app=guestbook
I0515 22:34:05.063]            tier=frontend
I0515 22:34:05.063]   Containers:
I0515 22:34:05.063]    php-redis:
I0515 22:34:05.063]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
I0515 22:34:05.179] Namespace:    namespace-1557959643-10108
I0515 22:34:05.179] Selector:     app=guestbook,tier=frontend
I0515 22:34:05.179] Labels:       app=guestbook
I0515 22:34:05.180]               tier=frontend
I0515 22:34:05.180] Annotations:  <none>
I0515 22:34:05.180] Replicas:     3 current / 3 desired
I0515 22:34:05.180] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0515 22:34:05.180] Pod Template:
I0515 22:34:05.180]   Labels:  app=guestbook
I0515 22:34:05.180]            tier=frontend
I0515 22:34:05.180]   Containers:
I0515 22:34:05.181]    php-redis:
I0515 22:34:05.181]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
I0515 22:34:05.294] Namespace:    namespace-1557959643-10108
I0515 22:34:05.294] Selector:     app=guestbook,tier=frontend
I0515 22:34:05.295] Labels:       app=guestbook
I0515 22:34:05.295]               tier=frontend
I0515 22:34:05.295] Annotations:  <none>
I0515 22:34:05.295] Replicas:     3 current / 3 desired
I0515 22:34:05.295] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0515 22:34:05.295] Pod Template:
I0515 22:34:05.295]   Labels:  app=guestbook
I0515 22:34:05.295]            tier=frontend
I0515 22:34:05.295]   Containers:
I0515 22:34:05.295]    php-redis:
I0515 22:34:05.296]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 12 lines ...
I0515 22:34:05.410] Namespace:    namespace-1557959643-10108
I0515 22:34:05.410] Selector:     app=guestbook,tier=frontend
I0515 22:34:05.410] Labels:       app=guestbook
I0515 22:34:05.410]               tier=frontend
I0515 22:34:05.410] Annotations:  <none>
I0515 22:34:05.411] Replicas:     3 current / 3 desired
I0515 22:34:05.411] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0515 22:34:05.411] Pod Template:
I0515 22:34:05.411]   Labels:  app=guestbook
I0515 22:34:05.411]            tier=frontend
I0515 22:34:05.411]   Containers:
I0515 22:34:05.412]    php-redis:
I0515 22:34:05.412]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
I0515 22:34:05.564] Namespace:    namespace-1557959643-10108
I0515 22:34:05.564] Selector:     app=guestbook,tier=frontend
I0515 22:34:05.564] Labels:       app=guestbook
I0515 22:34:05.564]               tier=frontend
I0515 22:34:05.564] Annotations:  <none>
I0515 22:34:05.565] Replicas:     3 current / 3 desired
I0515 22:34:05.565] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0515 22:34:05.565] Pod Template:
I0515 22:34:05.565]   Labels:  app=guestbook
I0515 22:34:05.565]            tier=frontend
I0515 22:34:05.565]   Containers:
I0515 22:34:05.566]    php-redis:
I0515 22:34:05.566]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
I0515 22:34:05.686] Namespace:    namespace-1557959643-10108
I0515 22:34:05.686] Selector:     app=guestbook,tier=frontend
I0515 22:34:05.686] Labels:       app=guestbook
I0515 22:34:05.686]               tier=frontend
I0515 22:34:05.686] Annotations:  <none>
I0515 22:34:05.686] Replicas:     3 current / 3 desired
I0515 22:34:05.687] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0515 22:34:05.687] Pod Template:
I0515 22:34:05.687]   Labels:  app=guestbook
I0515 22:34:05.687]            tier=frontend
I0515 22:34:05.687]   Containers:
I0515 22:34:05.687]    php-redis:
I0515 22:34:05.687]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
I0515 22:34:05.811] Namespace:    namespace-1557959643-10108
I0515 22:34:05.811] Selector:     app=guestbook,tier=frontend
I0515 22:34:05.811] Labels:       app=guestbook
I0515 22:34:05.811]               tier=frontend
I0515 22:34:05.811] Annotations:  <none>
I0515 22:34:05.811] Replicas:     3 current / 3 desired
I0515 22:34:05.811] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0515 22:34:05.811] Pod Template:
I0515 22:34:05.812]   Labels:  app=guestbook
I0515 22:34:05.812]            tier=frontend
I0515 22:34:05.812]   Containers:
I0515 22:34:05.812]    php-redis:
I0515 22:34:05.812]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 11 lines ...
I0515 22:34:05.928] Namespace:    namespace-1557959643-10108
I0515 22:34:05.928] Selector:     app=guestbook,tier=frontend
I0515 22:34:05.928] Labels:       app=guestbook
I0515 22:34:05.928]               tier=frontend
I0515 22:34:05.928] Annotations:  <none>
I0515 22:34:05.928] Replicas:     3 current / 3 desired
I0515 22:34:05.928] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0515 22:34:05.929] Pod Template:
I0515 22:34:05.929]   Labels:  app=guestbook
I0515 22:34:05.929]            tier=frontend
I0515 22:34:05.929]   Containers:
I0515 22:34:05.929]    php-redis:
I0515 22:34:05.929]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
W0515 22:34:06.233] I0515 22:34:06.140512   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1557959643-10108", Name:"frontend", UID:"faa8d9ad-01c2-48c9-ae21-f62b8f6585d9", APIVersion:"v1", ResourceVersion:"1674", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-qmngw
I0515 22:34:06.334] core.sh:1071: Successful get rc frontend {{.spec.replicas}}: 2
I0515 22:34:06.338] core.sh:1075: Successful get rc frontend {{.spec.replicas}}: 2
I0515 22:34:06.534] core.sh:1079: Successful get rc frontend {{.spec.replicas}}: 2
I0515 22:34:06.634] core.sh:1083: Successful get rc frontend {{.spec.replicas}}: 2
I0515 22:34:06.739] replicationcontroller/frontend scaled
W0515 22:34:06.840] error: Expected replicas to be 3, was 2
W0515 22:34:06.841] I0515 22:34:06.744589   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1557959643-10108", Name:"frontend", UID:"faa8d9ad-01c2-48c9-ae21-f62b8f6585d9", APIVersion:"v1", ResourceVersion:"1680", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-jlq5c
I0515 22:34:06.941] core.sh:1087: Successful get rc frontend {{.spec.replicas}}: 3
I0515 22:34:06.947] core.sh:1091: Successful get rc frontend {{.spec.replicas}}: 3
I0515 22:34:07.033] replicationcontroller/frontend scaled
I0515 22:34:07.134] core.sh:1095: Successful get rc frontend {{.spec.replicas}}: 2
I0515 22:34:07.219] (Breplicationcontroller "frontend" deleted
... skipping 41 lines ...
I0515 22:34:09.450] service "expose-test-deployment" deleted
I0515 22:34:09.568] Successful
I0515 22:34:09.568] message:service/expose-test-deployment exposed
I0515 22:34:09.568] has:service/expose-test-deployment exposed
I0515 22:34:09.658] service "expose-test-deployment" deleted
I0515 22:34:09.760] Successful
I0515 22:34:09.761] message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
I0515 22:34:09.761] See 'kubectl expose -h' for help and examples
I0515 22:34:09.761] has:invalid deployment: no selectors
I0515 22:34:09.860] Successful
I0515 22:34:09.861] message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
I0515 22:34:09.861] See 'kubectl expose -h' for help and examples
I0515 22:34:09.861] has:invalid deployment: no selectors
I0515 22:34:10.073] deployment.apps/nginx-deployment created
W0515 22:34:10.174] I0515 22:34:10.078324   51154 event.go:258] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1557959643-10108", Name:"nginx-deployment", UID:"70c66d3c-2aaf-4913-b4c8-5789b237dd55", APIVersion:"apps/v1", ResourceVersion:"1802", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-5cb597d4f to 3
W0515 22:34:10.175] I0515 22:34:10.083557   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959643-10108", Name:"nginx-deployment-5cb597d4f", UID:"e7cc88ae-dbf2-449e-acdd-e6fbed2ce053", APIVersion:"apps/v1", ResourceVersion:"1803", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-5cb597d4f-gv6hw
W0515 22:34:10.175] I0515 22:34:10.088815   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959643-10108", Name:"nginx-deployment-5cb597d4f", UID:"e7cc88ae-dbf2-449e-acdd-e6fbed2ce053", APIVersion:"apps/v1", ResourceVersion:"1803", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-5cb597d4f-ccmqw
... skipping 23 lines ...
I0515 22:34:12.225] service "frontend" deleted
I0515 22:34:12.232] service "frontend-2" deleted
I0515 22:34:12.241] service "frontend-3" deleted
I0515 22:34:12.248] service "frontend-4" deleted
I0515 22:34:12.254] service "frontend-5" deleted
I0515 22:34:12.361] Successful
I0515 22:34:12.361] message:error: cannot expose a Node
I0515 22:34:12.362] has:cannot expose
I0515 22:34:12.454] Successful
I0515 22:34:12.454] message:The Service "invalid-large-service-name-that-has-more-than-sixty-three-characters" is invalid: metadata.name: Invalid value: "invalid-large-service-name-that-has-more-than-sixty-three-characters": must be no more than 63 characters
I0515 22:34:12.455] has:metadata.name: Invalid value
I0515 22:34:12.552] Successful
I0515 22:34:12.553] message:service/kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
... skipping 30 lines ...
I0515 22:34:14.770] horizontalpodautoscaler.autoscaling/frontend autoscaled
I0515 22:34:14.889] core.sh:1259: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
I0515 22:34:14.977] horizontalpodautoscaler.autoscaling "frontend" deleted
I0515 22:34:15.074] horizontalpodautoscaler.autoscaling/frontend autoscaled
I0515 22:34:15.174] core.sh:1263: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
I0515 22:34:15.257] horizontalpodautoscaler.autoscaling "frontend" deleted
W0515 22:34:15.358] Error: required flag(s) "max" not set
W0515 22:34:15.358] 
W0515 22:34:15.359] 
W0515 22:34:15.359] Examples:
W0515 22:34:15.359]   # Auto scale a deployment "foo", with the number of pods between 2 and 10, no target CPU utilization specified so a default autoscaling policy will be used:
W0515 22:34:15.359]   kubectl autoscale deployment foo --min=2 --max=10
W0515 22:34:15.359]   
... skipping 55 lines ...
I0515 22:34:15.604]           limits:
I0515 22:34:15.604]             cpu: 300m
I0515 22:34:15.604]           requests:
I0515 22:34:15.604]             cpu: 300m
I0515 22:34:15.604]       terminationGracePeriodSeconds: 0
I0515 22:34:15.604] status: {}
W0515 22:34:15.704] Error from server (NotFound): deployments.apps "nginx-deployment-resources" not found
I0515 22:34:15.898] deployment.apps/nginx-deployment-resources created
W0515 22:34:15.999] I0515 22:34:15.903420   51154 event.go:258] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1557959643-10108", Name:"nginx-deployment-resources", UID:"626b1e29-724d-43eb-a765-e9a15c59d7bc", APIVersion:"apps/v1", ResourceVersion:"1943", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-865b6bb7c6 to 3
W0515 22:34:15.999] I0515 22:34:15.908328   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959643-10108", Name:"nginx-deployment-resources-865b6bb7c6", UID:"8ea452b0-339e-4000-b61a-f2378801b576", APIVersion:"apps/v1", ResourceVersion:"1944", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-865b6bb7c6-6xwl9
W0515 22:34:16.000] I0515 22:34:15.914349   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959643-10108", Name:"nginx-deployment-resources-865b6bb7c6", UID:"8ea452b0-339e-4000-b61a-f2378801b576", APIVersion:"apps/v1", ResourceVersion:"1944", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-865b6bb7c6-l4jl2
W0515 22:34:16.000] I0515 22:34:15.914390   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959643-10108", Name:"nginx-deployment-resources-865b6bb7c6", UID:"8ea452b0-339e-4000-b61a-f2378801b576", APIVersion:"apps/v1", ResourceVersion:"1944", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-865b6bb7c6-d4rz6
I0515 22:34:16.101] core.sh:1278: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment-resources:
... skipping 2 lines ...
I0515 22:34:16.318] deployment.extensions/nginx-deployment-resources resource requirements updated
W0515 22:34:16.419] I0515 22:34:16.328170   51154 event.go:258] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1557959643-10108", Name:"nginx-deployment-resources", UID:"626b1e29-724d-43eb-a765-e9a15c59d7bc", APIVersion:"apps/v1", ResourceVersion:"1957", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-69b4c96c9b to 1
W0515 22:34:16.420] I0515 22:34:16.336149   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959643-10108", Name:"nginx-deployment-resources-69b4c96c9b", UID:"3be8f113-0d17-4f75-ae0e-609414362c4a", APIVersion:"apps/v1", ResourceVersion:"1958", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-69b4c96c9b-2djj4
I0515 22:34:16.520] core.sh:1283: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 100m:
I0515 22:34:16.551] core.sh:1284: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
I0515 22:34:16.749] deployment.extensions/nginx-deployment-resources resource requirements updated
W0515 22:34:16.850] error: unable to find container named redis
W0515 22:34:16.850] I0515 22:34:16.774168   51154 event.go:258] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1557959643-10108", Name:"nginx-deployment-resources", UID:"626b1e29-724d-43eb-a765-e9a15c59d7bc", APIVersion:"apps/v1", ResourceVersion:"1966", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-865b6bb7c6 to 2
W0515 22:34:16.851] I0515 22:34:16.783083   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959643-10108", Name:"nginx-deployment-resources-865b6bb7c6", UID:"8ea452b0-339e-4000-b61a-f2378801b576", APIVersion:"apps/v1", ResourceVersion:"1970", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-865b6bb7c6-6xwl9
W0515 22:34:16.851] I0515 22:34:16.796375   51154 event.go:258] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1557959643-10108", Name:"nginx-deployment-resources", UID:"626b1e29-724d-43eb-a765-e9a15c59d7bc", APIVersion:"apps/v1", ResourceVersion:"1969", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-7bb7d84c58 to 1
W0515 22:34:16.851] I0515 22:34:16.799911   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959643-10108", Name:"nginx-deployment-resources-7bb7d84c58", UID:"0cdbc1f9-1f9e-478b-b112-9a8b54c6993f", APIVersion:"apps/v1", ResourceVersion:"1976", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-7bb7d84c58-mdvm5
I0515 22:34:16.952] core.sh:1289: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
I0515 22:34:16.981] core.sh:1290: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
... skipping 211 lines ...
I0515 22:34:17.535]     status: "True"
I0515 22:34:17.535]     type: Progressing
I0515 22:34:17.535]   observedGeneration: 4
I0515 22:34:17.535]   replicas: 4
I0515 22:34:17.535]   unavailableReplicas: 4
I0515 22:34:17.535]   updatedReplicas: 1
W0515 22:34:17.636] error: you must specify resources by --filename when --local is set.
W0515 22:34:17.636] Example resource specifications include:
W0515 22:34:17.636]    '-f rsrc.yaml'
W0515 22:34:17.636]    '--filename=rsrc.json'
I0515 22:34:17.737] core.sh:1299: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
I0515 22:34:17.813] core.sh:1300: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
I0515 22:34:17.915] core.sh:1301: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
... skipping 44 lines ...
I0515 22:34:19.538]                 pod-template-hash=75c7695cbd
I0515 22:34:19.538] Annotations:    deployment.kubernetes.io/desired-replicas: 1
I0515 22:34:19.538]                 deployment.kubernetes.io/max-replicas: 2
I0515 22:34:19.539]                 deployment.kubernetes.io/revision: 1
I0515 22:34:19.539] Controlled By:  Deployment/test-nginx-apps
I0515 22:34:19.539] Replicas:       1 current / 1 desired
I0515 22:34:19.539] Pods Status:    0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0515 22:34:19.539] Pod Template:
I0515 22:34:19.539]   Labels:  app=test-nginx-apps
I0515 22:34:19.540]            pod-template-hash=75c7695cbd
I0515 22:34:19.540]   Containers:
I0515 22:34:19.540]    nginx:
I0515 22:34:19.540]     Image:        k8s.gcr.io/nginx:test-cmd
... skipping 90 lines ...
I0515 22:34:24.219]     Image:	k8s.gcr.io/nginx:test-cmd
I0515 22:34:24.317] apps.sh:296: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
I0515 22:34:24.432] deployment.extensions/nginx rolled back
I0515 22:34:25.548] apps.sh:300: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0515 22:34:25.756] apps.sh:303: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0515 22:34:25.866] deployment.extensions/nginx rolled back
W0515 22:34:25.967] error: unable to find specified revision 1000000 in history
I0515 22:34:26.973] apps.sh:307: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
I0515 22:34:27.073] deployment.extensions/nginx paused
W0515 22:34:27.191] error: you cannot rollback a paused deployment; resume it first with 'kubectl rollout resume deployment/nginx' and try again
W0515 22:34:27.285] error: deployments.extensions "nginx" can't restart paused deployment (run rollout resume first)
I0515 22:34:27.389] deployment.extensions/nginx resumed
I0515 22:34:27.521] deployment.extensions/nginx rolled back
I0515 22:34:27.728]     deployment.kubernetes.io/revision-history: 1,3
W0515 22:34:27.921] error: desired revision (3) is different from the running revision (5)
I0515 22:34:28.027] deployment.extensions/nginx restarted
W0515 22:34:28.128] I0515 22:34:28.048347   51154 event.go:258] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1557959658-17093", Name:"nginx", UID:"c77ac597-f3e7-4da8-8f6d-12186c831935", APIVersion:"apps/v1", ResourceVersion:"2190", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-958dc566b to 2
W0515 22:34:28.128] I0515 22:34:28.054550   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959658-17093", Name:"nginx-958dc566b", UID:"32f6ef87-c032-4d39-8b06-695046c512e6", APIVersion:"apps/v1", ResourceVersion:"2194", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-958dc566b-57wqz
W0515 22:34:28.129] I0515 22:34:28.069133   51154 event.go:258] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1557959658-17093", Name:"nginx", UID:"c77ac597-f3e7-4da8-8f6d-12186c831935", APIVersion:"apps/v1", ResourceVersion:"2193", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-6c779cd8dc to 1
W0515 22:34:28.129] I0515 22:34:28.073263   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959658-17093", Name:"nginx-6c779cd8dc", UID:"2a78c779-dcf2-4130-a73d-e237e58f1f13", APIVersion:"apps/v1", ResourceVersion:"2201", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-6c779cd8dc-xqt9z
I0515 22:34:29.245] Successful
... skipping 143 lines ...
I0515 22:34:30.428] deployment.extensions/nginx-deployment image updated
W0515 22:34:30.529] I0515 22:34:30.434627   51154 event.go:258] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1557959658-17093", Name:"nginx-deployment", UID:"c6ac5fcc-b038-4eb2-ae15-f39f3fb9864d", APIVersion:"apps/v1", ResourceVersion:"2258", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-64f55cb875 to 1
W0515 22:34:30.529] I0515 22:34:30.438883   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959658-17093", Name:"nginx-deployment-64f55cb875", UID:"a9e00dd0-c9b4-428d-a732-324a59e653b6", APIVersion:"apps/v1", ResourceVersion:"2259", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-64f55cb875-svz5r
I0515 22:34:30.630] apps.sh:345: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
I0515 22:34:30.651] apps.sh:346: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
I0515 22:34:30.884] deployment.extensions/nginx-deployment image updated
W0515 22:34:30.985] error: unable to find container named "redis"
I0515 22:34:31.085] apps.sh:351: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0515 22:34:31.097] apps.sh:352: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
I0515 22:34:31.196] deployment.apps/nginx-deployment image updated
I0515 22:34:31.299] apps.sh:355: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
I0515 22:34:31.393] apps.sh:356: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
I0515 22:34:31.574] apps.sh:359: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
... skipping 65 lines ...
I0515 22:34:35.478] replicaset.apps/frontend created
I0515 22:34:35.505] +++ [0515 22:34:35] Deleting rs
I0515 22:34:35.590] replicaset.extensions "frontend" deleted
I0515 22:34:35.694] apps.sh:516: Successful get pods -l "tier=frontend" {{range.items}}{{.metadata.name}}:{{end}}: 
I0515 22:34:35.790] apps.sh:520: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
I0515 22:34:35.991] replicaset.apps/frontend-no-cascade created
W0515 22:34:36.092] E0515 22:34:34.881764   51154 replica_set.go:450] Sync "namespace-1557959658-17093/nginx-deployment-57b54775" failed with Operation cannot be fulfilled on replicasets.apps "nginx-deployment-57b54775": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1557959658-17093/nginx-deployment-57b54775, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: e86f680f-61a5-4a37-b8b8-a45866d64ae9, UID in object meta: 
W0515 22:34:36.093] E0515 22:34:34.930958   51154 replica_set.go:450] Sync "namespace-1557959658-17093/nginx-deployment-5dd68b6c76" failed with replicasets.apps "nginx-deployment-5dd68b6c76" not found
W0515 22:34:36.093] E0515 22:34:34.981520   51154 replica_set.go:450] Sync "namespace-1557959658-17093/nginx-deployment-5dfd5c49d4" failed with replicasets.apps "nginx-deployment-5dfd5c49d4" not found
W0515 22:34:36.093] E0515 22:34:35.031625   51154 replica_set.go:450] Sync "namespace-1557959658-17093/nginx-deployment-7d8bf5bf54" failed with replicasets.apps "nginx-deployment-7d8bf5bf54" not found
W0515 22:34:36.093] I0515 22:34:35.484933   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959675-27364", Name:"frontend", UID:"1985617d-ef02-4985-8693-f2f076ebe8f6", APIVersion:"apps/v1", ResourceVersion:"2435", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-n9qhq
W0515 22:34:36.094] I0515 22:34:35.490199   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959675-27364", Name:"frontend", UID:"1985617d-ef02-4985-8693-f2f076ebe8f6", APIVersion:"apps/v1", ResourceVersion:"2435", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-fjr7k
W0515 22:34:36.094] I0515 22:34:35.490268   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959675-27364", Name:"frontend", UID:"1985617d-ef02-4985-8693-f2f076ebe8f6", APIVersion:"apps/v1", ResourceVersion:"2435", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-prbkf
W0515 22:34:36.094] I0515 22:34:35.997245   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959675-27364", Name:"frontend-no-cascade", UID:"94c356f4-d797-4d75-9e8c-cd17609ffcd3", APIVersion:"apps/v1", ResourceVersion:"2452", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-nnmm4
W0515 22:34:36.094] I0515 22:34:36.001976   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959675-27364", Name:"frontend-no-cascade", UID:"94c356f4-d797-4d75-9e8c-cd17609ffcd3", APIVersion:"apps/v1", ResourceVersion:"2452", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-bpckf
W0515 22:34:36.095] I0515 22:34:36.002312   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1557959675-27364", Name:"frontend-no-cascade", UID:"94c356f4-d797-4d75-9e8c-cd17609ffcd3", APIVersion:"apps/v1", ResourceVersion:"2452", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-6rbl5
I0515 22:34:36.195] apps.sh:526: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
I0515 22:34:36.196] +++ [0515 22:34:36] Deleting rs
I0515 22:34:36.198] replicaset.extensions "frontend-no-cascade" deleted
W0515 22:34:36.298] E0515 22:34:36.220258   51154 replica_set.go:450] Sync "namespace-1557959675-27364/frontend-no-cascade" failed with Operation cannot be fulfilled on replicasets.apps "frontend-no-cascade": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1557959675-27364/frontend-no-cascade, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 94c356f4-d797-4d75-9e8c-cd17609ffcd3, UID in object meta: 
I0515 22:34:36.399] apps.sh:530: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
I0515 22:34:36.423] apps.sh:532: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
I0515 22:34:36.514] pod "frontend-no-cascade-6rbl5" deleted
I0515 22:34:36.520] pod "frontend-no-cascade-bpckf" deleted
I0515 22:34:36.527] pod "frontend-no-cascade-nnmm4" deleted
I0515 22:34:36.633] apps.sh:535: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 8 lines ...
I0515 22:34:37.203] Namespace:    namespace-1557959675-27364
I0515 22:34:37.203] Selector:     app=guestbook,tier=frontend
I0515 22:34:37.203] Labels:       app=guestbook
I0515 22:34:37.203]               tier=frontend
I0515 22:34:37.204] Annotations:  <none>
I0515 22:34:37.204] Replicas:     3 current / 3 desired
I0515 22:34:37.204] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0515 22:34:37.204] Pod Template:
I0515 22:34:37.204]   Labels:  app=guestbook
I0515 22:34:37.204]            tier=frontend
I0515 22:34:37.205]   Containers:
I0515 22:34:37.205]    php-redis:
I0515 22:34:37.205]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
I0515 22:34:37.327] Namespace:    namespace-1557959675-27364
I0515 22:34:37.327] Selector:     app=guestbook,tier=frontend
I0515 22:34:37.327] Labels:       app=guestbook
I0515 22:34:37.327]               tier=frontend
I0515 22:34:37.327] Annotations:  <none>
I0515 22:34:37.328] Replicas:     3 current / 3 desired
I0515 22:34:37.328] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0515 22:34:37.328] Pod Template:
I0515 22:34:37.328]   Labels:  app=guestbook
I0515 22:34:37.328]            tier=frontend
I0515 22:34:37.328]   Containers:
I0515 22:34:37.329]    php-redis:
I0515 22:34:37.329]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 18 lines ...
I0515 22:34:37.440] Namespace:    namespace-1557959675-27364
I0515 22:34:37.440] Selector:     app=guestbook,tier=frontend
I0515 22:34:37.440] Labels:       app=guestbook
I0515 22:34:37.440]               tier=frontend
I0515 22:34:37.440] Annotations:  <none>
I0515 22:34:37.440] Replicas:     3 current / 3 desired
I0515 22:34:37.440] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0515 22:34:37.440] Pod Template:
I0515 22:34:37.441]   Labels:  app=guestbook
I0515 22:34:37.441]            tier=frontend
I0515 22:34:37.441]   Containers:
I0515 22:34:37.441]    php-redis:
I0515 22:34:37.441]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 12 lines ...
I0515 22:34:37.560] Namespace:    namespace-1557959675-27364
I0515 22:34:37.560] Selector:     app=guestbook,tier=frontend
I0515 22:34:37.560] Labels:       app=guestbook
I0515 22:34:37.560]               tier=frontend
I0515 22:34:37.560] Annotations:  <none>
I0515 22:34:37.560] Replicas:     3 current / 3 desired
I0515 22:34:37.561] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0515 22:34:37.561] Pod Template:
I0515 22:34:37.561]   Labels:  app=guestbook
I0515 22:34:37.561]            tier=frontend
I0515 22:34:37.561]   Containers:
I0515 22:34:37.561]    php-redis:
I0515 22:34:37.561]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 18 lines ...
I0515 22:34:37.718] Namespace:    namespace-1557959675-27364
I0515 22:34:37.718] Selector:     app=guestbook,tier=frontend
I0515 22:34:37.718] Labels:       app=guestbook
I0515 22:34:37.719]               tier=frontend
I0515 22:34:37.719] Annotations:  <none>
I0515 22:34:37.719] Replicas:     3 current / 3 desired
I0515 22:34:37.719] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0515 22:34:37.719] Pod Template:
I0515 22:34:37.719]   Labels:  app=guestbook
I0515 22:34:37.719]            tier=frontend
I0515 22:34:37.719]   Containers:
I0515 22:34:37.719]    php-redis:
I0515 22:34:37.720]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
I0515 22:34:37.831] Namespace:    namespace-1557959675-27364
I0515 22:34:37.831] Selector:     app=guestbook,tier=frontend
I0515 22:34:37.831] Labels:       app=guestbook
I0515 22:34:37.831]               tier=frontend
I0515 22:34:37.831] Annotations:  <none>
I0515 22:34:37.831] Replicas:     3 current / 3 desired
I0515 22:34:37.831] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0515 22:34:37.832] Pod Template:
I0515 22:34:37.832]   Labels:  app=guestbook
I0515 22:34:37.832]            tier=frontend
I0515 22:34:37.832]   Containers:
I0515 22:34:37.832]    php-redis:
I0515 22:34:37.832]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
I0515 22:34:37.936] Namespace:    namespace-1557959675-27364
I0515 22:34:37.937] Selector:     app=guestbook,tier=frontend
I0515 22:34:37.937] Labels:       app=guestbook
I0515 22:34:37.937]               tier=frontend
I0515 22:34:37.937] Annotations:  <none>
I0515 22:34:37.937] Replicas:     3 current / 3 desired
I0515 22:34:37.937] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0515 22:34:37.937] Pod Template:
I0515 22:34:37.937]   Labels:  app=guestbook
I0515 22:34:37.937]            tier=frontend
I0515 22:34:37.938]   Containers:
I0515 22:34:37.938]    php-redis:
I0515 22:34:37.938]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 11 lines ...
I0515 22:34:38.050] Namespace:    namespace-1557959675-27364
I0515 22:34:38.050] Selector:     app=guestbook,tier=frontend
I0515 22:34:38.050] Labels:       app=guestbook
I0515 22:34:38.050]               tier=frontend
I0515 22:34:38.050] Annotations:  <none>
I0515 22:34:38.050] Replicas:     3 current / 3 desired
I0515 22:34:38.050] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0515 22:34:38.050] Pod Template:
I0515 22:34:38.050]   Labels:  app=guestbook
I0515 22:34:38.051]            tier=frontend
I0515 22:34:38.051]   Containers:
I0515 22:34:38.051]    php-redis:
I0515 22:34:38.051]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 181 lines ...
I0515 22:34:43.668] horizontalpodautoscaler.autoscaling/frontend autoscaled
I0515 22:34:43.754] apps.sh:651: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
I0515 22:34:43.829] horizontalpodautoscaler.autoscaling "frontend" deleted
I0515 22:34:43.916] horizontalpodautoscaler.autoscaling/frontend autoscaled
I0515 22:34:44.014] apps.sh:655: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
I0515 22:34:44.094] horizontalpodautoscaler.autoscaling "frontend" deleted
W0515 22:34:44.195] Error: required flag(s) "max" not set
W0515 22:34:44.195] 
W0515 22:34:44.195] 
W0515 22:34:44.195] Examples:
W0515 22:34:44.196]   # Auto scale a deployment "foo", with the number of pods between 2 and 10, no target CPU utilization specified so a default autoscaling policy will be used:
W0515 22:34:44.196]   kubectl autoscale deployment foo --min=2 --max=10
W0515 22:34:44.196]   
... skipping 89 lines ...
I0515 22:34:47.417] apps.sh:439: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I0515 22:34:47.509] apps.sh:440: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
I0515 22:34:47.615] statefulset.apps/nginx rolled back
I0515 22:34:47.727] apps.sh:443: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
I0515 22:34:47.826] apps.sh:444: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0515 22:34:47.930] Successful
I0515 22:34:47.931] message:error: unable to find specified revision 1000000 in history
I0515 22:34:47.931] has:unable to find specified revision
I0515 22:34:48.032] apps.sh:448: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
I0515 22:34:48.124] apps.sh:449: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0515 22:34:48.233] statefulset.apps/nginx rolled back
I0515 22:34:48.333] apps.sh:452: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
I0515 22:34:48.424] apps.sh:453: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
... skipping 58 lines ...
I0515 22:34:50.427] Name:         mock
I0515 22:34:50.427] Namespace:    namespace-1557959689-32318
I0515 22:34:50.427] Selector:     app=mock
I0515 22:34:50.427] Labels:       app=mock
I0515 22:34:50.427] Annotations:  <none>
I0515 22:34:50.428] Replicas:     1 current / 1 desired
I0515 22:34:50.428] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0515 22:34:50.428] Pod Template:
I0515 22:34:50.428]   Labels:  app=mock
I0515 22:34:50.428]   Containers:
I0515 22:34:50.429]    mock-container:
I0515 22:34:50.429]     Image:        k8s.gcr.io/pause:2.0
I0515 22:34:50.429]     Port:         9949/TCP
... skipping 56 lines ...
I0515 22:34:52.840] Name:         mock
I0515 22:34:52.840] Namespace:    namespace-1557959689-32318
I0515 22:34:52.840] Selector:     app=mock
I0515 22:34:52.840] Labels:       app=mock
I0515 22:34:52.841] Annotations:  <none>
I0515 22:34:52.841] Replicas:     1 current / 1 desired
I0515 22:34:52.841] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0515 22:34:52.841] Pod Template:
I0515 22:34:52.841]   Labels:  app=mock
I0515 22:34:52.841]   Containers:
I0515 22:34:52.841]    mock-container:
I0515 22:34:52.841]     Image:        k8s.gcr.io/pause:2.0
I0515 22:34:52.841]     Port:         9949/TCP
... skipping 56 lines ...
I0515 22:34:55.318] Name:         mock
I0515 22:34:55.318] Namespace:    namespace-1557959689-32318
I0515 22:34:55.318] Selector:     app=mock
I0515 22:34:55.318] Labels:       app=mock
I0515 22:34:55.319] Annotations:  <none>
I0515 22:34:55.319] Replicas:     1 current / 1 desired
I0515 22:34:55.319] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0515 22:34:55.319] Pod Template:
I0515 22:34:55.319]   Labels:  app=mock
I0515 22:34:55.319]   Containers:
I0515 22:34:55.319]    mock-container:
I0515 22:34:55.319]     Image:        k8s.gcr.io/pause:2.0
I0515 22:34:55.320]     Port:         9949/TCP
... skipping 41 lines ...
I0515 22:34:57.670] Namespace:    namespace-1557959689-32318
I0515 22:34:57.670] Selector:     app=mock
I0515 22:34:57.671] Labels:       app=mock
I0515 22:34:57.671]               status=replaced
I0515 22:34:57.671] Annotations:  <none>
I0515 22:34:57.671] Replicas:     1 current / 1 desired
I0515 22:34:57.671] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0515 22:34:57.671] Pod Template:
I0515 22:34:57.671]   Labels:  app=mock
I0515 22:34:57.671]   Containers:
I0515 22:34:57.671]    mock-container:
I0515 22:34:57.672]     Image:        k8s.gcr.io/pause:2.0
I0515 22:34:57.672]     Port:         9949/TCP
... skipping 11 lines ...
I0515 22:34:57.680] Namespace:    namespace-1557959689-32318
I0515 22:34:57.680] Selector:     app=mock2
I0515 22:34:57.680] Labels:       app=mock2
I0515 22:34:57.680]               status=replaced
I0515 22:34:57.680] Annotations:  <none>
I0515 22:34:57.681] Replicas:     1 current / 1 desired
I0515 22:34:57.681] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0515 22:34:57.681] Pod Template:
I0515 22:34:57.681]   Labels:  app=mock2
I0515 22:34:57.681]   Containers:
I0515 22:34:57.681]    mock-container:
I0515 22:34:57.682]     Image:        k8s.gcr.io/pause:2.0
I0515 22:34:57.682]     Port:         9949/TCP
... skipping 104 lines ...
I0515 22:35:03.216] +++ [0515 22:35:03] Creating namespace namespace-1557959703-30947
I0515 22:35:03.292] namespace/namespace-1557959703-30947 created
I0515 22:35:03.367] Context "test" modified.
I0515 22:35:03.376] +++ [0515 22:35:03] Testing persistent volumes
I0515 22:35:03.474] storage.sh:30: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
I0515 22:35:03.670] persistentvolume/pv0001 created
W0515 22:35:03.771] E0515 22:35:03.678215   51154 pv_protection_controller.go:117] PV pv0001 failed with : Operation cannot be fulfilled on persistentvolumes "pv0001": the object has been modified; please apply your changes to the latest version and try again
I0515 22:35:03.872] storage.sh:33: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
I0515 22:35:03.872] persistentvolume "pv0001" deleted
I0515 22:35:04.066] persistentvolume/pv0002 created
W0515 22:35:04.167] E0515 22:35:04.071010   51154 pv_protection_controller.go:117] PV pv0002 failed with : Operation cannot be fulfilled on persistentvolumes "pv0002": the object has been modified; please apply your changes to the latest version and try again
I0515 22:35:04.268] storage.sh:36: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0002:
I0515 22:35:04.268] persistentvolume "pv0002" deleted
I0515 22:35:04.462] persistentvolume/pv0003 created
I0515 22:35:04.578] storage.sh:39: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0003:
I0515 22:35:04.654] persistentvolume "pv0003" deleted
I0515 22:35:04.769] storage.sh:42: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 498 lines ...
I0515 22:35:10.469] yes
I0515 22:35:10.469] has:the server doesn't have a resource type
I0515 22:35:10.551] Successful
I0515 22:35:10.551] message:yes
I0515 22:35:10.551] has:yes
I0515 22:35:10.628] Successful
I0515 22:35:10.628] message:error: --subresource can not be used with NonResourceURL
I0515 22:35:10.629] has:subresource can not be used with NonResourceURL
I0515 22:35:10.716] Successful
I0515 22:35:10.808] Successful
I0515 22:35:10.809] message:yes
I0515 22:35:10.809] 0
I0515 22:35:10.809] has:0
... skipping 27 lines ...
I0515 22:35:11.452] role.rbac.authorization.k8s.io/testing-R reconciled
I0515 22:35:11.554] legacy-script.sh:801: Successful get rolebindings -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-RB:
I0515 22:35:11.649] legacy-script.sh:802: Successful get roles -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-R:
I0515 22:35:11.752] legacy-script.sh:803: Successful get clusterrolebindings -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CRB:
I0515 22:35:11.853] legacy-script.sh:804: Successful get clusterroles -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CR:
I0515 22:35:11.941] Successful
I0515 22:35:11.942] message:error: only rbac.authorization.k8s.io/v1 is supported: not *v1beta1.ClusterRole
I0515 22:35:11.942] has:only rbac.authorization.k8s.io/v1 is supported
I0515 22:35:12.036] rolebinding.rbac.authorization.k8s.io "testing-RB" deleted
I0515 22:35:12.042] role.rbac.authorization.k8s.io "testing-R" deleted
I0515 22:35:12.053] clusterrole.rbac.authorization.k8s.io "testing-CR" deleted
I0515 22:35:12.062] clusterrolebinding.rbac.authorization.k8s.io "testing-CRB" deleted
I0515 22:35:12.076] Recording: run_retrieve_multiple_tests
... skipping 45 lines ...
I0515 22:35:13.380] +++ Running case: test-cmd.run_kubectl_explain_tests 
I0515 22:35:13.382] +++ working dir: /go/src/k8s.io/kubernetes
I0515 22:35:13.386] +++ command: run_kubectl_explain_tests
I0515 22:35:13.397] +++ [0515 22:35:13] Testing kubectl(v1:explain)
W0515 22:35:13.497] I0515 22:35:13.240133   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1557959712-13579", Name:"cassandra", UID:"6cc1b32c-2af4-4742-ac63-acf2185ae7ec", APIVersion:"v1", ResourceVersion:"3020", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-fclv6
W0515 22:35:13.498] I0515 22:35:13.257821   51154 event.go:258] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1557959712-13579", Name:"cassandra", UID:"6cc1b32c-2af4-4742-ac63-acf2185ae7ec", APIVersion:"v1", ResourceVersion:"3020", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-9zh22
W0515 22:35:13.498] E0515 22:35:13.262174   51154 replica_set.go:450] Sync "namespace-1557959712-13579/cassandra" failed with replicationcontrollers "cassandra" not found
I0515 22:35:13.598] KIND:     Pod
I0515 22:35:13.598] VERSION:  v1
I0515 22:35:13.599] 
I0515 22:35:13.599] DESCRIPTION:
I0515 22:35:13.599]      Pod is a collection of containers that can run on a host. This resource is
I0515 22:35:13.599]      created by clients and scheduled onto hosts.
... skipping 977 lines ...
I0515 22:35:42.012] message:node/127.0.0.1 already uncordoned (dry run)
I0515 22:35:42.012] has:already uncordoned
I0515 22:35:42.108] node-management.sh:119: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
I0515 22:35:42.218] (Bnode/127.0.0.1 labeled
I0515 22:35:42.320] node-management.sh:124: Successful get nodes 127.0.0.1 {{.metadata.labels.test}}: label
I0515 22:35:42.396] (BSuccessful
I0515 22:35:42.397] message:error: cannot specify both a node name and a --selector option
I0515 22:35:42.397] See 'kubectl drain -h' for help and examples
I0515 22:35:42.397] has:cannot specify both a node name
I0515 22:35:42.471] Successful
I0515 22:35:42.472] message:error: USAGE: cordon NODE [flags]
I0515 22:35:42.472] See 'kubectl cordon -h' for help and examples
I0515 22:35:42.472] has:error\: USAGE\: cordon NODE
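The two failures above exercise kubectl's argument validation for drain/cordon: a node may be selected either by explicit name or by `--selector`, never both, and cordon requires at least one node argument. A minimal sketch of that check (hypothetical helper name; not kubectl's actual code) looks like:

```python
# Sketch of the mutual-exclusion check behind the two errors above
# ("cannot specify both a node name and a --selector option" and
# "USAGE: cordon NODE [flags]"). Hypothetical helper, for illustration only.
def validate_node_args(node_names, selector):
    """Return the effective node selection, or raise on invalid combinations."""
    if node_names and selector:
        # Explicit names and a label selector are mutually exclusive.
        raise ValueError("cannot specify both a node name and a --selector option")
    if not node_names and not selector:
        # At least one way of naming nodes is required.
        raise ValueError("USAGE: cordon NODE [flags]")
    return node_names or selector
```

With this shape, `validate_node_args(["127.0.0.1"], "")` succeeds while passing both a name and a selector raises, matching the test expectations above.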
I0515 22:35:42.553] node/127.0.0.1 already uncordoned
I0515 22:35:42.634] Successful
I0515 22:35:42.634] message:error: You must provide one or more resources by argument or filename.
I0515 22:35:42.634] Example resource specifications include:
I0515 22:35:42.634]    '-f rsrc.yaml'
I0515 22:35:42.634]    '--filename=rsrc.json'
I0515 22:35:42.634]    '<resource> <name>'
I0515 22:35:42.635]    '<resource>'
I0515 22:35:42.635] has:must provide one or more resources
... skipping 15 lines ...
I0515 22:35:43.129] Successful
I0515 22:35:43.129] message:The following compatible plugins are available:
I0515 22:35:43.129] 
I0515 22:35:43.129] test/fixtures/pkg/kubectl/plugins/version/kubectl-version
I0515 22:35:43.129]   - warning: kubectl-version overwrites existing command: "kubectl version"
I0515 22:35:43.130] 
I0515 22:35:43.130] error: one plugin warning was found
I0515 22:35:43.130] has:kubectl-version overwrites existing command: "kubectl version"
I0515 22:35:43.212] Successful
I0515 22:35:43.212] message:The following compatible plugins are available:
I0515 22:35:43.213] 
I0515 22:35:43.213] test/fixtures/pkg/kubectl/plugins/kubectl-foo
I0515 22:35:43.213] test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo
I0515 22:35:43.213]   - warning: test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin: test/fixtures/pkg/kubectl/plugins/kubectl-foo
I0515 22:35:43.213] 
I0515 22:35:43.213] error: one plugin warning was found
I0515 22:35:43.214] has:test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin
I0515 22:35:43.289] Successful
I0515 22:35:43.289] message:The following compatible plugins are available:
I0515 22:35:43.289] 
I0515 22:35:43.290] test/fixtures/pkg/kubectl/plugins/kubectl-foo
I0515 22:35:43.290] has:plugins are available
I0515 22:35:43.368] Successful
I0515 22:35:43.369] message:Unable read directory "test/fixtures/pkg/kubectl/plugins/empty" from your PATH: open test/fixtures/pkg/kubectl/plugins/empty: no such file or directory. Skipping...
I0515 22:35:43.369] error: unable to find any kubectl plugins in your PATH
I0515 22:35:43.369] has:unable to find any kubectl plugins in your PATH
I0515 22:35:43.447] Successful
I0515 22:35:43.448] message:I am plugin foo
I0515 22:35:43.448] has:plugin foo
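The plugin tests above rely on kubectl's PATH-ordered plugin discovery: executables named `kubectl-*` are collected from PATH directories in order, the first file with a given name wins, and a later file with the same base name only produces an "overshadowed" warning. A self-contained sketch of that ordering rule (hypothetical function and in-memory directory listing standing in for the real filesystem scan) is:

```python
# Sketch of kubectl-style plugin discovery. `listing` maps a directory to the
# file names it contains, standing in for os.listdir on real PATH entries.
import os

def find_plugins(path_dirs, listing):
    """Return (name -> winning path, overshadow warnings) in PATH order."""
    seen = {}      # plugin name -> first path found; first occurrence wins
    warnings = []
    for d in path_dirs:
        for name in listing.get(d, []):
            if not name.startswith("kubectl-"):
                continue  # only kubectl-* executables are plugin candidates
            full = os.path.join(d, name)
            if name in seen:
                warnings.append(
                    f"{full} is overshadowed by a similarly named plugin: {seen[name]}"
                )
            else:
                seen[name] = full
    return seen, warnings
```

Under these assumptions, listing `kubectl-foo` in two PATH entries reproduces the overshadow warning seen in the log, with the earlier directory's copy winning.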
I0515 22:35:43.529] Successful
I0515 22:35:43.529] message:Client Version: version.Info{Major:"1", Minor:"16+", GitVersion:"v1.16.0-alpha.0.62+6936c2e3ebb85f", GitCommit:"6936c2e3ebb85f930b4f974050c2e47abe7f30c7", GitTreeState:"clean", BuildDate:"2019-05-15T22:28:29Z", GoVersion:"go1.12.1", Compiler:"gc", Platform:"linux/amd64"}
... skipping 9 lines ...
I0515 22:35:43.615] 
I0515 22:35:43.618] +++ Running case: test-cmd.run_impersonation_tests 
I0515 22:35:43.620] +++ working dir: /go/src/k8s.io/kubernetes
I0515 22:35:43.624] +++ command: run_impersonation_tests
I0515 22:35:43.634] +++ [0515 22:35:43] Testing impersonation
I0515 22:35:43.712] Successful
I0515 22:35:43.712] message:error: requesting groups or user-extra for  without impersonating a user
I0515 22:35:43.712] has:without impersonating a user
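The failure above checks kubectl's impersonation flag validation: impersonated groups or user-extra fields are only meaningful once an impersonated username is set. A minimal sketch of that rule (hypothetical helper, not the actual client-go code) is:

```python
# Sketch of the check behind "requesting groups or user-extra ... without
# impersonating a user": groups/extra require an impersonated user first.
def validate_impersonation(user, groups=(), extra=None):
    """Return the impersonation config, or raise if groups/extra lack a user."""
    if (groups or extra) and not user:
        raise ValueError("requesting groups or user-extra without impersonating a user")
    return {"user": user, "groups": list(groups), "extra": extra or {}}
```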
I0515 22:35:43.904] certificatesigningrequest.certificates.k8s.io/foo created
I0515 22:35:44.013] authorization.sh:68: Successful get csr/foo {{.spec.username}}: user1
I0515 22:35:44.107] (Bauthorization.sh:69: Successful get csr/foo {{range .spec.groups}}{{.}}{{end}}: system:authenticated
I0515 22:35:44.191] (Bcertificatesigningrequest.certificates.k8s.io "foo" deleted
I0515 22:35:44.412] certificatesigningrequest.certificates.k8s.io/foo created
... skipping 52 lines ...
W0515 22:35:47.721] I0515 22:35:47.716641   47798 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0515 22:35:47.721] I0515 22:35:47.716784   47798 controller.go:176] Shutting down kubernetes service endpoint reconciler
W0515 22:35:47.722] I0515 22:35:47.716827   47798 clientconn.go:1016] blockingPicker: the picked transport is not ready, loop back to repick
W0515 22:35:47.722] I0515 22:35:47.716981   47798 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0515 22:35:47.722] I0515 22:35:47.717032   47798 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0515 22:35:47.723] I0515 22:35:47.715272   47798 secure_serving.go:160] Stopped listening on 127.0.0.1:8080
W0515 22:35:47.723] W0515 22:35:47.717429   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:47.723] I0515 22:35:47.717579   47798 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
... skipping 192 lines ...
W0515 22:35:47.781] W0515 22:35:47.722356   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:47.782] W0515 22:35:47.722356   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:47.782] W0515 22:35:47.722381   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:47.782] W0515 22:35:47.722383   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:47.782] W0515 22:35:47.722539   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:47.783] W0515 22:35:47.722571   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:47.783] I0515 22:35:47.722574   47798 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0515 22:35:47.783] I0515 22:35:47.722712   47798 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0515 22:35:47.783] W0515 22:35:47.722766   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:47.784] I0515 22:35:47.727089   47798 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0515 22:35:47.784] I0515 22:35:47.722769   47798 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0515 22:35:47.784] I0515 22:35:47.722795   47798 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0515 22:35:47.785] I0515 22:35:47.722853   47798 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0515 22:35:47.785] I0515 22:35:47.727176   47798 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0515 22:35:47.785] W0515 22:35:47.722855   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:47.786] W0515 22:35:47.723080   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:47.786] W0515 22:35:47.726779   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:47.786] W0515 22:35:47.726898   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:47.787] I0515 22:35:47.727115   47798 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0515 22:35:47.787] I0515 22:35:47.727128   47798 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0515 22:35:47.787] I0515 22:35:47.727428   47798 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0515 22:35:47.888] No resources found
I0515 22:35:47.888] No resources found
I0515 22:35:47.888] +++ [0515 22:35:47] TESTS PASSED
I0515 22:35:47.894] junit report dir: /workspace/artifacts
I0515 22:35:47.899] +++ [0515 22:35:47] Clean up complete
W0515 22:35:47.999] + make test-integration
W0515 22:35:48.717] W0515 22:35:48.716312   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 74 lines ...
W0515 22:35:50.008] W0515 22:35:50.007753   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 22 lines ...
W0515 22:35:50.194] W0515 22:35:50.193860   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.212] W0515 22:35:50.211712   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.215] W0515 22:35:50.214908   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.217] W0515 22:35:50.217081   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.219] W0515 22:35:50.218577   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.261] W0515 22:35:50.260799   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.262] W0515 22:35:50.262327   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.276] W0515 22:35:50.273010   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.279] W0515 22:35:50.278611   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.294] W0515 22:35:50.293922   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.299] W0515 22:35:50.298552   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.299] W0515 22:35:50.298583   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.306] W0515 22:35:50.306238   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.322] W0515 22:35:50.320755   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.325] W0515 22:35:50.324538   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.341] W0515 22:35:50.340584   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.355] W0515 22:35:50.354264   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.359] W0515 22:35:50.358543   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.359] W0515 22:35:50.359035   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.360] W0515 22:35:50.359358   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.370] W0515 22:35:50.369330   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.371] W0515 22:35:50.370838   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.372] W0515 22:35:50.371651   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.399] W0515 22:35:50.398761   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.403] W0515 22:35:50.402830   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.425] W0515 22:35:50.424879   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.430] W0515 22:35:50.429433   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.432] W0515 22:35:50.431788   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.448] W0515 22:35:50.447407   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.449] W0515 22:35:50.449068   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.461] W0515 22:35:50.460775   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.471] W0515 22:35:50.470321   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.482] W0515 22:35:50.481350   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.484] W0515 22:35:50.483327   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.486] W0515 22:35:50.485913   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.501] W0515 22:35:50.500385   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.504] W0515 22:35:50.503210   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.511] W0515 22:35:50.510587   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.524] W0515 22:35:50.524182   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.527] W0515 22:35:50.526281   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.531] W0515 22:35:50.530503   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.538] W0515 22:35:50.537270   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.567] W0515 22:35:50.566749   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.570] W0515 22:35:50.569910   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.586] W0515 22:35:50.585006   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.599] W0515 22:35:50.598110   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.608] W0515 22:35:50.607834   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.612] W0515 22:35:50.611476   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.622] W0515 22:35:50.622011   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.626] W0515 22:35:50.625755   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.636] W0515 22:35:50.636126   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:50.639] W0515 22:35:50.638622   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.140] W0515 22:35:52.140185   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.190] W0515 22:35:52.190095   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.249] W0515 22:35:52.248569   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.251] W0515 22:35:52.250406   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.279] W0515 22:35:52.278769   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.311] W0515 22:35:52.311116   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.332] W0515 22:35:52.332126   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.342] W0515 22:35:52.341922   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.414] W0515 22:35:52.413439   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.419] W0515 22:35:52.418500   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.450] W0515 22:35:52.449750   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.474] W0515 22:35:52.473525   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.480] W0515 22:35:52.479698   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.516] W0515 22:35:52.515900   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.523] W0515 22:35:52.522684   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.527] W0515 22:35:52.527080   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.538] W0515 22:35:52.538090   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.574] W0515 22:35:52.573828   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.580] W0515 22:35:52.580455   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.600] W0515 22:35:52.599778   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.612] W0515 22:35:52.611492   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.642] W0515 22:35:52.642105   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.650] W0515 22:35:52.649613   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.659] W0515 22:35:52.659159   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.662] W0515 22:35:52.662294   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.680] W0515 22:35:52.679481   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.680] W0515 22:35:52.680298   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.692] W0515 22:35:52.691625   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.704] W0515 22:35:52.703421   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.714] W0515 22:35:52.713547   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.722] W0515 22:35:52.721352   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.722] W0515 22:35:52.721534   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.737] W0515 22:35:52.736988   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.747] W0515 22:35:52.746338   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.754] W0515 22:35:52.753446   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.772] W0515 22:35:52.772071   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.788] W0515 22:35:52.787708   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.815] W0515 22:35:52.814446   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.857] W0515 22:35:52.853088   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.871] W0515 22:35:52.870576   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.872] W0515 22:35:52.872207   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.876] W0515 22:35:52.875766   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.899] W0515 22:35:52.898298   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.901] W0515 22:35:52.900525   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.912] W0515 22:35:52.912309   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.945] W0515 22:35:52.945085   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0515 22:35:52.957] W0515 22:35:52.956387   47798 clientconn.go:1251] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 16 lines ...
W0515 22:35:53.238] I0515 22:35:53.237568   47798 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
... skipping 4 lines ...
I0515 22:35:53.433] +++ [0515 22:35:53] Checking etcd is on PATH
... skipping 25 lines ...
W0515 22:35:56.156] I0515 22:35:56.156166   47798 asm_amd64.s:1337] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0515 22:35:56.158] E0515 22:35:56.157577   47798 controller.go:179] StorageError: key not found, Code: 1, Key: /registry/masterleases/172.17.0.2, ResourceVersion: 0, AdditionalErrorMsg: 
I0515 22:35:58.808] Running tests for APIVersion: v1,admissionregistration.k8s.io/v1beta1,admission.k8s.io/v1beta1,apps/v1,apps/v1beta1,apps/v1beta2,auditregistration.k8s.io/v1alpha1,authentication.k8s.io/v1,authentication.k8s.io/v1beta1,authorization.k8s.io/v1,authorization.k8s.io/v1beta1,autoscaling/v1,autoscaling/v2beta1,autoscaling/v2beta2,batch/v1,batch/v1beta1,batch/v2alpha1,certificates.k8s.io/v1beta1,coordination.k8s.io/v1beta1,coordination.k8s.io/v1,extensions/v1beta1,events.k8s.io/v1beta1,imagepolicy.k8s.io/v1alpha1,networking.k8s.io/v1,networking.k8s.io/v1beta1,node.k8s.io/v1alpha1,node.k8s.io/v1beta1,policy/v1beta1,rbac.authorization.k8s.io/v1,rbac.authorization.k8s.io/v1beta1,rbac.authorization.k8s.io/v1alpha1,scheduling.k8s.io/v1alpha1,scheduling.k8s.io/v1beta1,scheduling.k8s.io/v1,settings.k8s.io/v1alpha1,storage.k8s.io/v1beta1,storage.k8s.io/v1,storage.k8s.io/v1alpha1,
I0515 22:35:58.852] +++ [0515 22:35:58] Running tests without code coverage
W0515 22:37:13.948] # k8s.io/kubernetes/test/e2e/common
W0515 22:37:13.948] test/e2e/common/container_probe.go:61:17: cannot assign to containerName
W0515 22:37:13.948] test/e2e/common/container_probe.go:166:2: undefined: framework.It
W0515 22:37:13.949] test/e2e/common/container_probe.go:304:42: cannot use port (type int) as type int32 in field value
W0515 22:37:13.949] test/e2e/common/container_probe.go:361:46: undefined: "k8s.io/kubernetes/vendor/k8s.io/api/core/v1".Hander
W0515 22:37:13.949] test/e2e/common/container_probe.go:365:24: cannot use port (type int32) as type int in argument to intstr.FromInt
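The build failure above is a handful of small compile errors in container_probe.go: an assignment to a non-assignable `containerName`, a reference to a removed/renamed `framework.It` helper, a typo (`v1.Hander` for `v1.Handler`), and two int/int32 mismatches around port values. The port mismatches can be illustrated with a self-contained sketch; the types below are simplified stand-ins for the real `k8s.io/api/core/v1` and `intstr` types (assumptions for illustration, not the actual definitions):

```go
package main

import "fmt"

// IntOrString is a minimal stand-in for intstr.IntOrString.
type IntOrString struct{ IntVal int32 }

// FromInt mirrors intstr.FromInt's signature: it takes a plain int,
// which is why passing an int32 fails to compile (line 365 above).
func FromInt(val int) IntOrString { return IntOrString{IntVal: int32(val)} }

// TCPSocketAction and ContainerPort are stand-ins for the v1 API types;
// the real ContainerPort field is int32, not int (line 304 above).
type TCPSocketAction struct{ Port IntOrString }
type ContainerPort struct{ ContainerPort int32 }

func main() {
	port := 8080 // plain int, as in the failing test code

	// "cannot use port (type int) as type int32 in field value":
	// Go has no implicit numeric conversion, so an explicit int32() is needed.
	cp := ContainerPort{ContainerPort: int32(port)}

	// "cannot use port (type int32) as type int in argument to intstr.FromInt":
	// the conversion goes the other way when calling FromInt with an int32.
	var p32 int32 = 2379
	action := TCPSocketAction{Port: FromInt(int(p32))}

	fmt.Println(cp.ContainerPort, action.Port.IntVal)
}
```

Because Go never converts between `int` and `int32` implicitly, each call site must convert in the direction the target type requires, which is exactly what the two compiler messages are pointing at.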
I0515 22:49:51.316] ok  	k8s.io/kubernetes/test/integration/apimachinery	277.067s
I0515 22:49:51.316] ok  	k8s.io/kubernetes/test/integration/apiserver	86.904s
I0515 22:49:51.317] ok  	k8s.io/kubernetes/test/integration/apiserver/admissionwebhook	66.770s
I0515 22:49:51.317] ok  	k8s.io/kubernetes/test/integration/apiserver/apply	54.302s
I0515 22:49:51.317] FAIL	k8s.io/kubernetes/test/integration/auth [build failed]
I0515 22:49:51.317] ok  	k8s.io/kubernetes/test/integration/client	53.704s
I0515 22:49:51.317] ok  	k8s.io/kubernetes/test/integration/configmap	3.896s
I0515 22:49:51.317] ok  	k8s.io/kubernetes/test/integration/cronjob	24.956s
I0515 22:49:51.317] ok  	k8s.io/kubernetes/test/integration/daemonset	531.648s
I0515 22:49:51.317] ok  	k8s.io/kubernetes/test/integration/defaulttolerationseconds	3.781s
I0515 22:49:51.318] ok  	k8s.io/kubernetes/test/integration/deployment	208.835s
... skipping 25 lines ...
I0515 22:49:51.323] ok  	k8s.io/kubernetes/test/integration/storageclasses	4.095s
I0515 22:49:51.323] ok  	k8s.io/kubernetes/test/integration/tls	6.920s
I0515 22:49:51.323] ok  	k8s.io/kubernetes/test/integration/ttlcontroller	10.064s
I0515 22:49:51.323] ok  	k8s.io/kubernetes/test/integration/volume	94.528s
I0515 22:49:51.324] ok  	k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/test/integration	197.374s
I0515 22:50:06.688] +++ [0515 22:50:06] Saved JUnit XML test report to /workspace/artifacts/junit_d431ed5f68ae4ddf888439fb96b687a923412204_20190515-223558.xml
I0515 22:50:06.693] Makefile:185: recipe for target 'test' failed
I0515 22:50:06.704] +++ [0515 22:50:06] Cleaning up etcd
W0515 22:50:06.805] make[1]: *** [test] Error 1
W0515 22:50:06.805] !!! [0515 22:50:06] Call tree:
W0515 22:50:06.805] !!! [0515 22:50:06]  1: hack/make-rules/test-integration.sh:102 runTests(...)
I0515 22:50:07.330] +++ [0515 22:50:07] Integration test cleanup complete
I0515 22:50:07.330] Makefile:204: recipe for target 'test-integration' failed
W0515 22:50:07.431] make: *** [test-integration] Error 1
W0515 22:50:11.082] Traceback (most recent call last):
W0515 22:50:11.082]   File "/workspace/./test-infra/jenkins/../scenarios/kubernetes_verify.py", line 178, in <module>
W0515 22:50:11.082]     ARGS.exclude_typecheck, ARGS.exclude_godep)
W0515 22:50:11.082]   File "/workspace/./test-infra/jenkins/../scenarios/kubernetes_verify.py", line 140, in main
W0515 22:50:11.082]     check(*cmd)
W0515 22:50:11.083]   File "/workspace/./test-infra/jenkins/../scenarios/kubernetes_verify.py", line 48, in check
W0515 22:50:11.083]     subprocess.check_call(cmd)
W0515 22:50:11.083]   File "/usr/lib/python2.7/subprocess.py", line 186, in check_call
W0515 22:50:11.100]     raise CalledProcessError(retcode, cmd)
W0515 22:50:11.100] subprocess.CalledProcessError: Command '('docker', 'run', '--rm=true', '--privileged=true', '-v', '/var/run/docker.sock:/var/run/docker.sock', '-v', '/etc/localtime:/etc/localtime:ro', '-v', '/workspace/k8s.io/kubernetes:/go/src/k8s.io/kubernetes', '-v', '/workspace/k8s.io/:/workspace/k8s.io/', '-v', '/workspace/_artifacts:/workspace/artifacts', '-e', 'KUBE_FORCE_VERIFY_CHECKS=n', '-e', 'KUBE_VERIFY_GIT_BRANCH=master', '-e', 'EXCLUDE_TYPECHECK=n', '-e', 'EXCLUDE_GODEP=n', '-e', 'REPO_DIR=/workspace/k8s.io/kubernetes', '--tmpfs', '/tmp:exec,mode=1777', 'gcr.io/k8s-testimages/kubekins-test:1.14-v20190318-2ac98e338', 'bash', '-c', 'cd kubernetes && ./hack/jenkins/test-dockerized.sh')' returned non-zero exit status 2
E0515 22:50:11.109] Command failed
I0515 22:50:11.109] process 674 exited with code 1 after 28.7m
E0515 22:50:11.109] FAIL: pull-kubernetes-integration
I0515 22:50:11.110] Call:  gcloud auth activate-service-account --key-file=/etc/service-account/service-account.json
W0515 22:50:11.727] Activated service account credentials for: [pr-kubekins@kubernetes-jenkins-pull.iam.gserviceaccount.com]
I0515 22:50:11.772] process 112147 exited with code 0 after 0.0m
I0515 22:50:11.772] Call:  gcloud config get-value account
I0515 22:50:12.069] process 112159 exited with code 0 after 0.0m
I0515 22:50:12.069] Will upload results to gs://kubernetes-jenkins/pr-logs using pr-kubekins@kubernetes-jenkins-pull.iam.gserviceaccount.com
I0515 22:50:12.069] Upload result and artifacts...
I0515 22:50:12.070] Gubernator results at https://gubernator.k8s.io/build/kubernetes-jenkins/pr-logs/pull/70658/pull-kubernetes-integration/1128787191997140995
I0515 22:50:12.070] Call:  gsutil ls gs://kubernetes-jenkins/pr-logs/pull/70658/pull-kubernetes-integration/1128787191997140995/artifacts
W0515 22:50:13.250] CommandException: One or more URLs matched no objects.
E0515 22:50:13.380] Command failed
I0515 22:50:13.380] process 112171 exited with code 1 after 0.0m
W0515 22:50:13.380] Remote dir gs://kubernetes-jenkins/pr-logs/pull/70658/pull-kubernetes-integration/1128787191997140995/artifacts not exist yet
I0515 22:50:13.380] Call:  gsutil -m -q -o GSUtil:use_magicfile=True cp -r -c -z log,txt,xml /workspace/_artifacts gs://kubernetes-jenkins/pr-logs/pull/70658/pull-kubernetes-integration/1128787191997140995/artifacts
I0515 22:50:18.491] process 112313 exited with code 0 after 0.1m
W0515 22:50:18.492] metadata path /workspace/_artifacts/metadata.json does not exist
W0515 22:50:18.492] metadata not found or invalid, init with empty metadata
... skipping 22 lines ...