Result: FAILURE
Tests: 0 failed / 86 succeeded
Started: 2019-03-15 16:20
Elapsed: 13m56s
Revision:
Builder: gke-prow-containerd-pool-99179761-3t2l
pod: 131ee842-473e-11e9-ab9f-0a580a6c0a8e
resultstore: https://source.cloud.google.com/results/invocations/123ba3a8-2328-4077-b55b-8e8ec0e644e6/targets/test
infra-commit: 26f7e332d
repo: k8s.io/kubernetes
repo-commit: b0494b081d5c97c21115cd2921f7c5b536470591
repos: {k8s.io/kubernetes: master}

No Test Failures!



Error lines from build-log.txt

... skipping 304 lines ...
W0315 16:29:38.358] I0315 16:29:38.357782   55620 serving.go:312] Generated self-signed cert (/tmp/apiserver.crt, /tmp/apiserver.key)
W0315 16:29:38.359] I0315 16:29:38.357886   55620 server.go:559] external host was not specified, using 172.17.0.2
W0315 16:29:38.359] W0315 16:29:38.357902   55620 authentication.go:415] AnonymousAuth is not allowed with the AlwaysAllow authorizer. Resetting AnonymousAuth to false. You should use a different authorizer
W0315 16:29:38.359] I0315 16:29:38.358212   55620 server.go:146] Version: v1.15.0-alpha.0.1226+b0494b081d5c97
W0315 16:29:39.036] I0315 16:29:39.035294   55620 plugins.go:158] Loaded 4 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority.
W0315 16:29:39.036] I0315 16:29:39.035328   55620 plugins.go:161] Loaded 4 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ResourceQuota.
W0315 16:29:39.037] E0315 16:29:39.035935   55620 prometheus.go:138] failed to register depth metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 16:29:39.037] E0315 16:29:39.035984   55620 prometheus.go:150] failed to register adds metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 16:29:39.037] E0315 16:29:39.036099   55620 prometheus.go:162] failed to register latency metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 16:29:39.037] E0315 16:29:39.036161   55620 prometheus.go:174] failed to register work_duration metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 16:29:39.037] E0315 16:29:39.036201   55620 prometheus.go:189] failed to register unfinished_work_seconds metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 16:29:39.038] E0315 16:29:39.036225   55620 prometheus.go:202] failed to register longest_running_processor_microseconds metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 16:29:39.038] I0315 16:29:39.036258   55620 plugins.go:158] Loaded 4 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority.
W0315 16:29:39.038] I0315 16:29:39.036268   55620 plugins.go:161] Loaded 4 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ResourceQuota.
W0315 16:29:39.038] I0315 16:29:39.038045   55620 clientconn.go:551] parsed scheme: ""
W0315 16:29:39.039] I0315 16:29:39.038068   55620 clientconn.go:557] scheme "" not registered, fallback to default scheme
W0315 16:29:39.039] I0315 16:29:39.038125   55620 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W0315 16:29:39.039] I0315 16:29:39.038292   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
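Editor's note: the repeated "duplicate metrics collector registration attempted" errors above come from the Prometheus Go client, which rejects a second collector whose descriptor matches one already registered. A minimal standalone sketch of that failure mode using github.com/prometheus/client_golang (the metric name is borrowed from the log; this is not the apiserver's actual wiring):

    package main

    import (
    	"fmt"

    	"github.com/prometheus/client_golang/prometheus"
    )

    func main() {
    	reg := prometheus.NewRegistry()

    	// First registration of the collector succeeds.
    	a := prometheus.NewCounter(prometheus.CounterOpts{
    		Name: "admission_quota_controller_adds",
    		Help: "example counter",
    	})
    	if err := reg.Register(a); err != nil {
    		fmt.Println("first:", err)
    	}

    	// A second collector with an identical descriptor is rejected
    	// with the exact error string seen in the log above.
    	b := prometheus.NewCounter(prometheus.CounterOpts{
    		Name: "admission_quota_controller_adds",
    		Help: "example counter",
    	})
    	if err := reg.Register(b); err != nil {
    		fmt.Println("second:", err) // duplicate metrics collector registration attempted
    	}
    }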
... skipping 361 lines ...
W0315 16:29:39.579] W0315 16:29:39.578226   55620 genericapiserver.go:344] Skipping API storage.k8s.io/v1alpha1 because it has no resources.
W0315 16:29:40.034] I0315 16:29:40.034195   55620 clientconn.go:551] parsed scheme: ""
W0315 16:29:40.035] I0315 16:29:40.034236   55620 clientconn.go:557] scheme "" not registered, fallback to default scheme
W0315 16:29:40.035] I0315 16:29:40.034386   55620 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W0315 16:29:40.035] I0315 16:29:40.034536   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:29:40.036] I0315 16:29:40.035158   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:29:40.600] E0315 16:29:40.599483   55620 prometheus.go:138] failed to register depth metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 16:29:40.600] E0315 16:29:40.599535   55620 prometheus.go:150] failed to register adds metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 16:29:40.601] E0315 16:29:40.599588   55620 prometheus.go:162] failed to register latency metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 16:29:40.601] E0315 16:29:40.599653   55620 prometheus.go:174] failed to register work_duration metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 16:29:40.602] E0315 16:29:40.599676   55620 prometheus.go:189] failed to register unfinished_work_seconds metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 16:29:40.602] E0315 16:29:40.599695   55620 prometheus.go:202] failed to register longest_running_processor_microseconds metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 16:29:40.602] I0315 16:29:40.599729   55620 plugins.go:158] Loaded 4 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority.
W0315 16:29:40.602] I0315 16:29:40.599736   55620 plugins.go:161] Loaded 4 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ResourceQuota.
W0315 16:29:40.603] I0315 16:29:40.601698   55620 clientconn.go:551] parsed scheme: ""
W0315 16:29:40.603] I0315 16:29:40.601726   55620 clientconn.go:557] scheme "" not registered, fallback to default scheme
W0315 16:29:40.603] I0315 16:29:40.601774   55620 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W0315 16:29:40.603] I0315 16:29:40.601898   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
... skipping 280 lines ...
W0315 16:30:27.512] I0315 16:30:27.507480   58922 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for deployments.extensions
W0315 16:30:27.512] I0315 16:30:27.507527   58922 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for cronjobs.batch
W0315 16:30:27.512] I0315 16:30:27.507561   58922 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for roles.rbac.authorization.k8s.io
W0315 16:30:27.512] I0315 16:30:27.507612   58922 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for podtemplates
W0315 16:30:27.513] I0315 16:30:27.507640   58922 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for daemonsets.apps
W0315 16:30:27.513] I0315 16:30:27.507666   58922 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for jobs.batch
W0315 16:30:27.513] E0315 16:30:27.507692   58922 resource_quota_controller.go:171] initial monitor sync has error: couldn't start monitor for resource "extensions/v1beta1, Resource=networkpolicies": unable to monitor quota for resource "extensions/v1beta1, Resource=networkpolicies"
W0315 16:30:27.513] I0315 16:30:27.507721   58922 controllermanager.go:497] Started "resourcequota"
W0315 16:30:27.513] I0315 16:30:27.507949   58922 resource_quota_controller.go:276] Starting resource quota controller
W0315 16:30:27.514] I0315 16:30:27.507988   58922 controller_utils.go:1027] Waiting for caches to sync for resource quota controller
W0315 16:30:27.514] I0315 16:30:27.508039   58922 resource_quota_monitor.go:301] QuotaMonitor running
W0315 16:30:27.514] I0315 16:30:27.508740   58922 controllermanager.go:497] Started "horizontalpodautoscaling"
W0315 16:30:27.514] I0315 16:30:27.508896   58922 horizontal.go:156] Starting HPA controller
... skipping 19 lines ...
W0315 16:30:27.523] I0315 16:30:27.522546   58922 taint_manager.go:175] Sending events to api server.
W0315 16:30:27.523] I0315 16:30:27.522908   58922 node_lifecycle_controller.go:390] Controller will reconcile labels.
W0315 16:30:27.523] I0315 16:30:27.523005   58922 node_lifecycle_controller.go:403] Controller will taint node by condition.
W0315 16:30:27.523] I0315 16:30:27.523074   58922 controllermanager.go:497] Started "nodelifecycle"
W0315 16:30:27.523] I0315 16:30:27.523131   58922 node_lifecycle_controller.go:427] Starting node controller
W0315 16:30:27.524] I0315 16:30:27.523152   58922 controller_utils.go:1027] Waiting for caches to sync for taint controller
W0315 16:30:27.527] E0315 16:30:27.526573   58922 core.go:77] Failed to start service controller: WARNING: no cloud provider provided, services of type LoadBalancer will fail
W0315 16:30:27.527] W0315 16:30:27.526600   58922 controllermanager.go:489] Skipping "service"
W0315 16:30:27.527] I0315 16:30:27.527453   58922 node_lifecycle_controller.go:77] Sending events to api server
W0315 16:30:27.527] E0315 16:30:27.527574   58922 core.go:161] failed to start cloud node lifecycle controller: no cloud provider provided
W0315 16:30:27.528] W0315 16:30:27.527588   58922 controllermanager.go:489] Skipping "cloud-node-lifecycle"
W0315 16:30:27.531] I0315 16:30:27.530638   58922 controllermanager.go:497] Started "replicationcontroller"
W0315 16:30:27.531] I0315 16:30:27.530682   58922 replica_set.go:182] Starting replicationcontroller controller
W0315 16:30:27.531] I0315 16:30:27.530699   58922 controller_utils.go:1027] Waiting for caches to sync for ReplicationController controller
W0315 16:30:27.531] I0315 16:30:27.531174   58922 controllermanager.go:497] Started "serviceaccount"
W0315 16:30:27.531] I0315 16:30:27.531299   58922 serviceaccounts_controller.go:115] Starting service account controller
... skipping 37 lines ...
W0315 16:30:27.540] W0315 16:30:27.539631   58922 controllermanager.go:489] Skipping "csrsigning"
W0315 16:30:27.540] W0315 16:30:27.539637   58922 controllermanager.go:476] "bootstrapsigner" is disabled
W0315 16:30:27.540] W0315 16:30:27.539642   58922 controllermanager.go:476] "tokencleaner" is disabled
W0315 16:30:27.540] W0315 16:30:27.539649   58922 controllermanager.go:489] Skipping "nodeipam"
W0315 16:30:27.540] I0315 16:30:27.540044   58922 clusterroleaggregation_controller.go:148] Starting ClusterRoleAggregator
W0315 16:30:27.541] I0315 16:30:27.540063   58922 controller_utils.go:1027] Waiting for caches to sync for ClusterRoleAggregator controller
W0315 16:30:27.593] W0315 16:30:27.592480   58922 actual_state_of_world.go:503] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="127.0.0.1" does not exist
W0315 16:30:27.605] I0315 16:30:27.604525   58922 controller_utils.go:1034] Caches are synced for ReplicaSet controller
W0315 16:30:27.614] I0315 16:30:27.613548   58922 controller_utils.go:1034] Caches are synced for GC controller
W0315 16:30:27.624] I0315 16:30:27.623488   58922 controller_utils.go:1034] Caches are synced for taint controller
W0315 16:30:27.624] I0315 16:30:27.623649   58922 node_lifecycle_controller.go:1159] Initializing eviction metric for zone: 
W0315 16:30:27.625] I0315 16:30:27.623733   58922 node_lifecycle_controller.go:1009] Controller detected that all Nodes are not-Ready. Entering master disruption mode.
W0315 16:30:27.625] I0315 16:30:27.623784   58922 taint_manager.go:198] Starting NoExecuteTaintManager
W0315 16:30:27.625] I0315 16:30:27.623782   58922 event.go:209] Event(v1.ObjectReference{Kind:"Node", Namespace:"", Name:"127.0.0.1", UID:"a41929b9-473f-11e9-a548-0242ac110002", APIVersion:"", ResourceVersion:"", FieldPath:""}): type: 'Normal' reason: 'RegisteredNode' Node 127.0.0.1 event: Registered Node 127.0.0.1 in Controller
W0315 16:30:27.633] I0315 16:30:27.632400   58922 controller_utils.go:1034] Caches are synced for certificate controller
W0315 16:30:27.635] I0315 16:30:27.634967   58922 controller_utils.go:1034] Caches are synced for TTL controller
W0315 16:30:27.639] I0315 16:30:27.638865   58922 controller_utils.go:1034] Caches are synced for deployment controller
W0315 16:30:27.640] I0315 16:30:27.640228   58922 controller_utils.go:1034] Caches are synced for ClusterRoleAggregator controller
W0315 16:30:27.667] E0315 16:30:27.666197   58922 clusterroleaggregation_controller.go:180] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again
W0315 16:30:27.684] I0315 16:30:27.683407   58922 controller_utils.go:1034] Caches are synced for endpoint controller
W0315 16:30:27.770] The Service "kubernetes" is invalid: spec.clusterIP: Invalid value: "10.0.0.1": provided IP is already allocated
W0315 16:30:27.809] I0315 16:30:27.809170   58922 controller_utils.go:1034] Caches are synced for HPA controller
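Editor's note: the "Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified" line above is the API server's optimistic-concurrency Conflict error; callers are expected to re-read the object and retry. A minimal sketch using client-go's retry helper, with a hypothetical stub standing in for a real GET/UPDATE round trip:

    package main

    import (
    	"fmt"

    	apierrors "k8s.io/apimachinery/pkg/api/errors"
    	"k8s.io/apimachinery/pkg/runtime/schema"
    	"k8s.io/client-go/util/retry"
    )

    var attempts int

    // updateEditClusterRole is a stand-in for a read-modify-write against
    // the API server; the first call fails with a Conflict, as in the log.
    func updateEditClusterRole() error {
    	attempts++
    	if attempts == 1 {
    		return apierrors.NewConflict(
    			schema.GroupResource{Group: "rbac.authorization.k8s.io", Resource: "clusterroles"},
    			"edit", fmt.Errorf("the object has been modified"))
    	}
    	return nil
    }

    func main() {
    	// RetryOnConflict re-runs the update function whenever it returns
    	// a Conflict error, backing off between attempts.
    	err := retry.RetryOnConflict(retry.DefaultRetry, updateEditClusterRole)
    	fmt.Println("final error:", err, "after", attempts, "attempts")
    }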
I0315 16:30:27.910] NAME         TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
I0315 16:30:27.910] kubernetes   ClusterIP   10.0.0.1     <none>        443/TCP   41s
I0315 16:30:27.910] Recording: run_kubectl_version_tests
... skipping 55 lines ...
I0315 16:30:29.001] +++ Running case: test-cmd.run_kubectl_config_set_tests 
I0315 16:30:29.003] +++ working dir: /go/src/k8s.io/kubernetes
I0315 16:30:29.006] +++ command: run_kubectl_config_set_tests
I0315 16:30:29.019] +++ [0315 16:30:29] Creating namespace namespace-1552667429-30549
W0315 16:30:29.119] I0315 16:30:29.032304   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:30:29.214] I0315 16:30:29.032525   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 16:30:29.215] E0315 16:30:29.206737   58922 resource_quota_controller.go:437] failed to sync resource monitors: couldn't start monitor for resource "extensions/v1beta1, Resource=networkpolicies": unable to monitor quota for resource "extensions/v1beta1, Resource=networkpolicies"
I0315 16:30:29.316] namespace/namespace-1552667429-30549 created
I0315 16:30:29.316] Context "test" modified.
I0315 16:30:29.316] +++ [0315 16:30:29] Testing kubectl(v1:config set)
I0315 16:30:29.378] Cluster "test-cluster" set.
I0315 16:30:29.460] Property "clusters.test-cluster.certificate-authority-data" set.
I0315 16:30:29.619] Property "clusters.test-cluster.certificate-authority-data" set.
... skipping 36 lines ...
I0315 16:30:31.787] +++ [0315 16:30:31] Creating namespace namespace-1552667431-12022
I0315 16:30:31.861] namespace/namespace-1552667431-12022 created
I0315 16:30:31.941] Context "test" modified.
I0315 16:30:31.947] +++ [0315 16:30:31] Testing RESTMapper
W0315 16:30:32.048] I0315 16:30:32.033736   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:30:32.049] I0315 16:30:32.033996   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 16:30:32.149] +++ [0315 16:30:32] "kubectl get unknownresourcetype" returns error as expected: error: the server doesn't have a resource type "unknownresourcetype"
I0315 16:30:32.150] +++ exit code: 0
I0315 16:30:32.234] NAME                              SHORTNAMES   APIGROUP                       NAMESPACED   KIND
I0315 16:30:32.235] bindings                                                                      true         Binding
I0315 16:30:32.235] componentstatuses                 cs                                          false        ComponentStatus
I0315 16:30:32.235] configmaps                        cm                                          true         ConfigMap
I0315 16:30:32.235] endpoints                         ep                                          true         Endpoints
... skipping 702 lines ...
I0315 16:30:55.330] poddisruptionbudget.policy/test-pdb-3 created
I0315 16:30:55.428] core.sh:251: Successful get pdb/test-pdb-3 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 2
I0315 16:30:55.504] poddisruptionbudget.policy/test-pdb-4 created
I0315 16:30:55.604] core.sh:255: Successful get pdb/test-pdb-4 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 50%
I0315 16:30:55.761] core.sh:261: Successful get pods --namespace=test-kubectl-describe-pod {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 16:30:55.963] pod/env-test-pod created
W0315 16:30:56.063] error: resource(s) were provided, but no name, label selector, or --all flag specified
W0315 16:30:56.064] error: setting 'all' parameter but found a non empty selector. 
W0315 16:30:56.064] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0315 16:30:56.064] I0315 16:30:54.045527   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:30:56.064] I0315 16:30:54.045737   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 16:30:56.064] I0315 16:30:54.983489   55620 controller.go:606] quota admission added evaluator for: poddisruptionbudgets.policy
W0315 16:30:56.065] I0315 16:30:55.046011   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:30:56.065] I0315 16:30:55.046216   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 16:30:56.065] error: min-available and max-unavailable cannot be both specified
W0315 16:30:56.065] I0315 16:30:56.046453   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:30:56.066] I0315 16:30:56.046627   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 16:30:56.173] core.sh:264: Successful describe pods --namespace=test-kubectl-describe-pod env-test-pod:
I0315 16:30:56.174] Name:               env-test-pod
I0315 16:30:56.174] Namespace:          test-kubectl-describe-pod
I0315 16:30:56.174] Priority:           0
... skipping 177 lines ...
W0315 16:31:09.266] I0315 16:31:09.054251   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:31:09.266] I0315 16:31:09.054873   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 16:31:09.476] core.sh:434: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 16:31:09.659] pod/valid-pod created
I0315 16:31:09.787] core.sh:438: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0315 16:31:09.968] Successful
I0315 16:31:09.969] message:Error from server: cannot restore map from string
I0315 16:31:09.969] has:cannot restore map from string
W0315 16:31:10.070] E0315 16:31:09.958657   55620 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"cannot restore map from string"}
W0315 16:31:10.070] I0315 16:31:10.055154   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:31:10.070] I0315 16:31:10.055396   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 16:31:10.171] Successful
I0315 16:31:10.171] message:pod/valid-pod patched (no change)
I0315 16:31:10.171] has:patched (no change)
I0315 16:31:10.192] pod/valid-pod patched
... skipping 8 lines ...
I0315 16:31:11.120] pod/valid-pod patched
W0315 16:31:11.221] I0315 16:31:11.055689   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:31:11.221] I0315 16:31:11.056406   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 16:31:11.322] core.sh:475: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:3.1:
I0315 16:31:11.423] pod/valid-pod patched
I0315 16:31:11.532] core.sh:491: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
I0315 16:31:11.719] +++ [0315 16:31:11] "kubectl patch with resourceVersion 503" returns error as expected: Error from server (Conflict): Operation cannot be fulfilled on pods "valid-pod": the object has been modified; please apply your changes to the latest version and try again
I0315 16:31:11.999] pod "valid-pod" deleted
I0315 16:31:12.012] pod/valid-pod replaced
I0315 16:31:12.117] core.sh:515: Successful get pod valid-pod {{(index .spec.containers 0).name}}: replaced-k8s-serve-hostname
I0315 16:31:12.314] Successful
I0315 16:31:12.314] message:error: --grace-period must have --force specified
I0315 16:31:12.314] has:\-\-grace-period must have \-\-force specified
W0315 16:31:12.415] I0315 16:31:12.056740   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:31:12.415] I0315 16:31:12.056991   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 16:31:12.516] Successful
I0315 16:31:12.516] message:error: --timeout must have --force specified
I0315 16:31:12.517] has:\-\-timeout must have \-\-force specified
I0315 16:31:12.658] node/node-v1-test created
W0315 16:31:12.759] W0315 16:31:12.657877   58922 actual_state_of_world.go:503] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="node-v1-test" does not exist
I0315 16:31:12.860] node/node-v1-test replaced
I0315 16:31:12.931] core.sh:552: Successful get node node-v1-test {{.metadata.annotations.a}}: b
I0315 16:31:13.010] node "node-v1-test" deleted
I0315 16:31:13.113] core.sh:559: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
I0315 16:31:13.439] core.sh:562: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/serve_hostname:
I0315 16:31:14.447] core.sh:575: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
... skipping 34 lines ...
W0315 16:31:16.378] Edit cancelled, no changes made.
W0315 16:31:16.378] Edit cancelled, no changes made.
W0315 16:31:16.378] I0315 16:31:14.057728   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:31:16.379] I0315 16:31:14.058024   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 16:31:16.379] Edit cancelled, no changes made.
W0315 16:31:16.379] Edit cancelled, no changes made.
W0315 16:31:16.379] error: 'name' already has a value (valid-pod), and --overwrite is false
W0315 16:31:16.379] I0315 16:31:15.058343   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:31:16.379] I0315 16:31:15.058563   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 16:31:16.379] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0315 16:31:16.380] I0315 16:31:16.058908   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:31:16.380] I0315 16:31:16.059134   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 16:31:16.480] core.sh:622: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 86 lines ...
I0315 16:31:23.165] +++ Running case: test-cmd.run_kubectl_create_error_tests 
I0315 16:31:23.165] +++ working dir: /go/src/k8s.io/kubernetes
I0315 16:31:23.166] +++ command: run_kubectl_create_error_tests
I0315 16:31:23.176] +++ [0315 16:31:23] Creating namespace namespace-1552667483-27534
I0315 16:31:23.265] namespace/namespace-1552667483-27534 created
I0315 16:31:23.345] Context "test" modified.
I0315 16:31:23.351] +++ [0315 16:31:23] Testing kubectl create with error
W0315 16:31:23.452] Error: must specify one of -f and -k
W0315 16:31:23.452] 
W0315 16:31:23.453] Create a resource from a file or from stdin.
W0315 16:31:23.453] 
W0315 16:31:23.453]  JSON and YAML formats are accepted.
W0315 16:31:23.453] 
W0315 16:31:23.453] Examples:
... skipping 41 lines ...
W0315 16:31:23.460] 
W0315 16:31:23.460] Usage:
W0315 16:31:23.460]   kubectl create -f FILENAME [options]
W0315 16:31:23.460] 
W0315 16:31:23.460] Use "kubectl <command> --help" for more information about a given command.
W0315 16:31:23.460] Use "kubectl options" for a list of global command-line options (applies to all commands).
I0315 16:31:23.595] +++ [0315 16:31:23] "kubectl create with empty string list returns error as expected: error: error validating "hack/testdata/invalid-rc-with-empty-args.yaml": error validating data: ValidationError(ReplicationController.spec.template.spec.containers[0].args): unknown object type "nil" in ReplicationController.spec.template.spec.containers[0].args[0]; if you choose to ignore these errors, turn validation off with --validate=false
W0315 16:31:23.696] kubectl convert is DEPRECATED and will be removed in a future version.
W0315 16:31:23.696] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
I0315 16:31:23.797] +++ exit code: 0
W0315 16:31:24.064] I0315 16:31:24.063961   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:31:24.065] I0315 16:31:24.064199   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 16:31:24.732] Recording: run_kubectl_apply_tests
... skipping 29 lines ...
W0315 16:31:27.626] I0315 16:31:27.626167   55620 clientconn.go:551] parsed scheme: ""
W0315 16:31:27.626] I0315 16:31:27.626205   55620 clientconn.go:557] scheme "" not registered, fallback to default scheme
W0315 16:31:27.627] I0315 16:31:27.626264   55620 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W0315 16:31:27.627] I0315 16:31:27.626334   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:31:27.627] I0315 16:31:27.626940   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:31:27.629] I0315 16:31:27.629356   55620 controller.go:606] quota admission added evaluator for: resources.mygroup.example.com
W0315 16:31:27.721] Error from server (NotFound): resources.mygroup.example.com "myobj" not found
I0315 16:31:27.821] kind.mygroup.example.com/myobj serverside-applied (server dry run)
I0315 16:31:27.822] customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
I0315 16:31:27.835] +++ exit code: 0
W0315 16:31:28.068] I0315 16:31:28.068223   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:31:28.069] I0315 16:31:28.068444   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 16:31:28.937] Recording: run_kubectl_run_tests
... skipping 108 lines ...
I0315 16:31:32.908] Context "test" modified.
I0315 16:31:32.914] +++ [0315 16:31:32] Testing kubectl create filter
I0315 16:31:33.016] create.sh:30: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 16:31:33.198] pod/selector-test-pod created
I0315 16:31:33.292] create.sh:34: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
I0315 16:31:33.374] Successful
I0315 16:31:33.374] message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
I0315 16:31:33.374] has:pods "selector-test-pod-dont-apply" not found
I0315 16:31:33.451] pod "selector-test-pod" deleted
I0315 16:31:33.470] +++ exit code: 0
W0315 16:31:33.570] I0315 16:31:33.070252   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:31:33.571] I0315 16:31:33.070608   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 16:31:34.071] I0315 16:31:34.070912   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
... skipping 47 lines ...
W0315 16:31:37.212] I0315 16:31:37.118170   58922 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552667494-2083", Name:"nginx", UID:"cdb2a3e8-473f-11e9-a548-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"604", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-776cc67f78 to 3
W0315 16:31:37.212] I0315 16:31:37.125489   58922 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552667494-2083", Name:"nginx-776cc67f78", UID:"cdb3c1eb-473f-11e9-a548-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"605", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-776cc67f78-7fk8x
W0315 16:31:37.213] I0315 16:31:37.131077   58922 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552667494-2083", Name:"nginx-776cc67f78", UID:"cdb3c1eb-473f-11e9-a548-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"605", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-776cc67f78-22526
W0315 16:31:37.213] I0315 16:31:37.139323   58922 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552667494-2083", Name:"nginx-776cc67f78", UID:"cdb3c1eb-473f-11e9-a548-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"605", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-776cc67f78-lslrm
I0315 16:31:37.314] apps.sh:147: Successful get deployment nginx {{.metadata.name}}: nginx
I0315 16:31:42.080] Successful
I0315 16:31:42.080] message:Error from server (Conflict): error when applying patch:
I0315 16:31:42.081] {"metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"extensions/v1beta1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1552667494-2083\",\"resourceVersion\":\"99\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx2\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx2\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"},"resourceVersion":"99"},"spec":{"selector":{"matchLabels":{"name":"nginx2"}},"template":{"metadata":{"labels":{"name":"nginx2"}}}}}
I0315 16:31:42.081] to:
I0315 16:31:42.081] Resource: "extensions/v1beta1, Resource=deployments", GroupVersionKind: "extensions/v1beta1, Kind=Deployment"
I0315 16:31:42.081] Name: "nginx", Namespace: "namespace-1552667494-2083"
I0315 16:31:42.084] Object: &{map["kind":"Deployment" "apiVersion":"extensions/v1beta1" "metadata":map["selfLink":"/apis/extensions/v1beta1/namespaces/namespace-1552667494-2083/deployments/nginx" "uid":"cdb2a3e8-473f-11e9-a548-0242ac110002" "managedFields":[map["operation":"Update" "apiVersion":"apps/v1" "time":"2019-03-15T16:31:37Z" "fields":map["f:metadata":map["f:annotations":map["f:deployment.kubernetes.io/revision":map[]]] "f:status":map["f:unavailableReplicas":map[] "f:updatedReplicas":map[] "f:conditions":map[".":map[] "k:{\"type\":\"Available\"}":map["f:lastTransitionTime":map[] "f:lastUpdateTime":map[] "f:message":map[] "f:reason":map[] "f:status":map[] "f:type":map[] ".":map[]]] "f:observedGeneration":map[] "f:replicas":map[]]] "manager":"kube-controller-manager"] map["time":"2019-03-15T16:31:37Z" "fields":map["f:metadata":map["f:labels":map[".":map[] "f:name":map[]] "f:annotations":map[".":map[] "f:kubectl.kubernetes.io/last-applied-configuration":map[]]] "f:spec":map["f:replicas":map[] "f:revisionHistoryLimit":map[] "f:selector":map["f:matchLabels":map[".":map[] "f:name":map[]] ".":map[]] "f:strategy":map["f:rollingUpdate":map[".":map[] "f:maxSurge":map[] "f:maxUnavailable":map[]] "f:type":map[]] "f:template":map["f:metadata":map["f:labels":map[".":map[] "f:name":map[]]] "f:spec":map["f:containers":map["k:{\"name\":\"nginx\"}":map["f:image":map[] "f:imagePullPolicy":map[] "f:name":map[] "f:ports":map[".":map[] "k:{\"containerPort\":80,\"protocol\":\"TCP\"}":map[".":map[] "f:containerPort":map[] "f:protocol":map[]]] "f:resources":map[] "f:terminationMessagePath":map[] "f:terminationMessagePolicy":map[] ".":map[]]] "f:dnsPolicy":map[] "f:restartPolicy":map[] "f:schedulerName":map[] "f:securityContext":map[] "f:terminationGracePeriodSeconds":map[]]] "f:progressDeadlineSeconds":map[]]] "manager":"kubectl" "operation":"Update" "apiVersion":"extensions/v1beta1"]] "annotations":map["deployment.kubernetes.io/revision":"1" "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"extensions/v1beta1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1552667494-2083\"},\"spec\":{\"replicas\":3,\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx1\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"] "name":"nginx" "namespace":"namespace-1552667494-2083" "resourceVersion":"617" "generation":'\x01' "creationTimestamp":"2019-03-15T16:31:37Z" "labels":map["name":"nginx"]] "spec":map["replicas":'\x03' "selector":map["matchLabels":map["name":"nginx1"]] "template":map["metadata":map["creationTimestamp":<nil> "labels":map["name":"nginx1"]] "spec":map["dnsPolicy":"ClusterFirst" "securityContext":map[] "schedulerName":"default-scheduler" "containers":[map["resources":map[] "terminationMessagePath":"/dev/termination-log" "terminationMessagePolicy":"File" "imagePullPolicy":"IfNotPresent" "name":"nginx" "image":"k8s.gcr.io/nginx:test-cmd" "ports":[map["containerPort":'P' "protocol":"TCP"]]]] "restartPolicy":"Always" "terminationGracePeriodSeconds":'\x1e']] "strategy":map["type":"RollingUpdate" "rollingUpdate":map["maxUnavailable":'\x01' "maxSurge":'\x01']] "revisionHistoryLimit":%!q(int64=+2147483647) "progressDeadlineSeconds":%!q(int64=+2147483647)] "status":map["conditions":[map["type":"Available" "status":"False" "lastUpdateTime":"2019-03-15T16:31:37Z" "lastTransitionTime":"2019-03-15T16:31:37Z" 
"reason":"MinimumReplicasUnavailable" "message":"Deployment does not have minimum availability."]] "observedGeneration":'\x01' "replicas":'\x03' "updatedReplicas":'\x03' "unavailableReplicas":'\x03']]}
I0315 16:31:42.084] for: "hack/testdata/deployment-label-change2.yaml": Operation cannot be fulfilled on deployments.extensions "nginx": the object has been modified; please apply your changes to the latest version and try again
I0315 16:31:42.084] has:Error from server (Conflict)
W0315 16:31:42.185] I0315 16:31:37.394106   58922 horizontal.go:320] Horizontal Pod Autoscaler frontend has been deleted in namespace-1552667480-6384
W0315 16:31:42.185] I0315 16:31:38.073076   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:31:42.185] I0315 16:31:38.073300   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 16:31:42.186] I0315 16:31:39.073643   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:31:42.186] I0315 16:31:39.073932   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 16:31:42.186] I0315 16:31:40.016589   55620 trace.go:81] Trace[1256579002]: "Get /apis/extensions/v1beta1/namespaces/namespace-1552667494-2083/deployments/nginx" (started: 2019-03-15 16:31:39.506012533 +0000 UTC m=+122.183233329) (total time: 510.54203ms):
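Editor's note: the Trace[...] line above is emitted by the apiserver's trace helper (published today as k8s.io/utils/trace), which times a request and logs the trace only when the total duration crosses a threshold. A minimal sketch; the request name and 500ms threshold are illustrative, and the sleep simulates a slow storage read:

    package main

    import (
    	"time"

    	utiltrace "k8s.io/utils/trace"
    )

    func main() {
    	trace := utiltrace.New("Get /apis/extensions/v1beta1/.../deployments/nginx")
    	// Logs the trace (with total time) only if it exceeds the threshold.
    	defer trace.LogIfLong(500 * time.Millisecond)

    	time.Sleep(510 * time.Millisecond) // simulated slow storage read
    	trace.Step("storage read done")
    }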
... skipping 29 lines ...
W0315 16:31:49.080] I0315 16:31:49.079827   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:31:49.081] I0315 16:31:49.080097   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 16:31:50.081] I0315 16:31:50.080307   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:31:50.081] I0315 16:31:50.080653   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 16:31:51.082] I0315 16:31:51.080941   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:31:51.082] I0315 16:31:51.081192   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 16:31:51.856] E0315 16:31:51.856274   58922 replica_set.go:450] Sync "namespace-1552667494-2083/nginx-7bd4fbc645" failed with replicasets.apps "nginx-7bd4fbc645" not found
W0315 16:31:52.082] I0315 16:31:52.081436   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:31:52.083] I0315 16:31:52.081749   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 16:31:52.838] I0315 16:31:52.837210   58922 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552667494-2083", Name:"nginx", UID:"d7116c4c-473f-11e9-a548-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"671", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-7bd4fbc645 to 3
W0315 16:31:52.845] I0315 16:31:52.844393   58922 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552667494-2083", Name:"nginx-7bd4fbc645", UID:"d712626b-473f-11e9-a548-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"672", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-7bd4fbc645-wkgtg
W0315 16:31:52.851] I0315 16:31:52.851128   58922 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552667494-2083", Name:"nginx-7bd4fbc645", UID:"d712626b-473f-11e9-a548-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"672", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-7bd4fbc645-k6lgh
W0315 16:31:52.854] I0315 16:31:52.854233   58922 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552667494-2083", Name:"nginx-7bd4fbc645", UID:"d712626b-473f-11e9-a548-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"672", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-7bd4fbc645-njtgp
... skipping 195 lines ...
I0315 16:31:58.014] +++ [0315 16:31:58] Creating namespace namespace-1552667518-2717
I0315 16:31:58.095] namespace/namespace-1552667518-2717 created
I0315 16:31:58.173] Context "test" modified.
I0315 16:31:58.181] +++ [0315 16:31:58] Testing kubectl get
I0315 16:31:58.286] get.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 16:31:58.378] Successful
I0315 16:31:58.378] message:Error from server (NotFound): pods "abc" not found
I0315 16:31:58.379] has:pods "abc" not found
I0315 16:31:58.482] get.sh:37: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 16:31:58.580] Successful
I0315 16:31:58.580] message:Error from server (NotFound): pods "abc" not found
I0315 16:31:58.580] has:pods "abc" not found
I0315 16:31:58.679] get.sh:45: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 16:31:58.769] Successful
I0315 16:31:58.769] message:{
I0315 16:31:58.770]     "apiVersion": "v1",
I0315 16:31:58.770]     "items": [],
... skipping 27 lines ...
I0315 16:31:59.255] has not:No resources found
I0315 16:31:59.279] Successful
I0315 16:31:59.279] message:NAME
I0315 16:31:59.280] has not:No resources found
I0315 16:31:59.403] get.sh:73: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 16:31:59.549] Successful
I0315 16:31:59.550] message:error: the server doesn't have a resource type "foobar"
I0315 16:31:59.550] has not:No resources found
I0315 16:31:59.667] Successful
I0315 16:31:59.667] message:No resources found.
I0315 16:31:59.668] has:No resources found
I0315 16:31:59.785] Successful
I0315 16:31:59.786] message:
I0315 16:31:59.786] has not:No resources found
I0315 16:31:59.891] Successful
I0315 16:31:59.891] message:No resources found.
I0315 16:31:59.891] has:No resources found
I0315 16:32:00.000] get.sh:93: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 16:32:00.108] Successful
I0315 16:32:00.109] message:Error from server (NotFound): pods "abc" not found
I0315 16:32:00.109] has:pods "abc" not found
I0315 16:32:00.111] FAIL!
I0315 16:32:00.112] message:Error from server (NotFound): pods "abc" not found
I0315 16:32:00.112] has not:List
I0315 16:32:00.112] 99 /go/src/k8s.io/kubernetes/test/cmd/../../test/cmd/get.sh
W0315 16:32:00.212] I0315 16:32:00.086394   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:32:00.213] I0315 16:32:00.086645   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 16:32:00.313] Successful
I0315 16:32:00.314] message:I0315 16:32:00.197661   69855 loader.go:359] Config loaded from file /tmp/tmp.ar74ot6XKi/.kube/config
... skipping 715 lines ...
I0315 16:32:04.060] }
I0315 16:32:04.142] get.sh:155: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0315 16:32:04.399] <no value>Successful
I0315 16:32:04.400] message:valid-pod:
I0315 16:32:04.400] has:valid-pod:
I0315 16:32:04.476] Successful
I0315 16:32:04.476] message:error: error executing jsonpath "{.missing}": Error executing template: missing is not found. Printing more information for debugging the template:
I0315 16:32:04.476] 	template was:
I0315 16:32:04.476] 		{.missing}
I0315 16:32:04.476] 	object given to jsonpath engine was:
I0315 16:32:04.478] 		map[string]interface {}{"kind":"Pod", "apiVersion":"v1", "metadata":map[string]interface {}{"name":"valid-pod", "namespace":"namespace-1552667523-23877", "selfLink":"/api/v1/namespaces/namespace-1552667523-23877/pods/valid-pod", "uid":"ddb26355-473f-11e9-a548-0242ac110002", "resourceVersion":"713", "creationTimestamp":"2019-03-15T16:32:03Z", "labels":map[string]interface {}{"name":"valid-pod"}, "managedFields":[]interface {}{map[string]interface {}{"operation":"Update", "apiVersion":"v1", "time":"2019-03-15T16:32:03Z", "fields":map[string]interface {}{"f:metadata":map[string]interface {}{"f:labels":map[string]interface {}{".":map[string]interface {}{}, "f:name":map[string]interface {}{}}}, "f:spec":map[string]interface {}{"f:enableServiceLinks":map[string]interface {}{}, "f:priority":map[string]interface {}{}, "f:restartPolicy":map[string]interface {}{}, "f:schedulerName":map[string]interface {}{}, "f:securityContext":map[string]interface {}{}, "f:terminationGracePeriodSeconds":map[string]interface {}{}, "f:containers":map[string]interface {}{"k:{\"name\":\"kubernetes-serve-hostname\"}":map[string]interface {}{"f:terminationMessagePath":map[string]interface {}{}, "f:terminationMessagePolicy":map[string]interface {}{}, ".":map[string]interface {}{}, "f:image":map[string]interface {}{}, "f:imagePullPolicy":map[string]interface {}{}, "f:name":map[string]interface {}{}, "f:resources":map[string]interface {}{".":map[string]interface {}{}, "f:limits":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}, "f:requests":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}}}}, "f:dnsPolicy":map[string]interface {}{}}}, "manager":"kubectl"}}}, "spec":map[string]interface {}{"priority":0, "enableServiceLinks":true, "containers":[]interface {}{map[string]interface {}{"name":"kubernetes-serve-hostname", "image":"k8s.gcr.io/serve_hostname", "resources":map[string]interface {}{"limits":map[string]interface {}{"cpu":"1", "memory":"512Mi"}, "requests":map[string]interface {}{"cpu":"1", "memory":"512Mi"}}, "terminationMessagePath":"/dev/termination-log", "terminationMessagePolicy":"File", "imagePullPolicy":"Always"}}, "restartPolicy":"Always", "terminationGracePeriodSeconds":30, "dnsPolicy":"ClusterFirst", "securityContext":map[string]interface {}{}, "schedulerName":"default-scheduler"}, "status":map[string]interface {}{"phase":"Pending", "qosClass":"Guaranteed"}}
I0315 16:32:04.478] has:missing is not found
I0315 16:32:04.563] Successful
I0315 16:32:04.564] message:Error executing template: template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing". Printing more information for debugging the template:
I0315 16:32:04.564] 	template was:
I0315 16:32:04.564] 		{{.missing}}
I0315 16:32:04.564] 	raw data was:
I0315 16:32:04.565] 		{"apiVersion":"v1","kind":"Pod","metadata":{"creationTimestamp":"2019-03-15T16:32:03Z","labels":{"name":"valid-pod"},"managedFields":[{"apiVersion":"v1","fields":{"f:metadata":{"f:labels":{".":{},"f:name":{}}},"f:spec":{"f:containers":{"k:{\"name\":\"kubernetes-serve-hostname\"}":{".":{},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:resources":{".":{},"f:limits":{".":{},"f:cpu":{},"f:memory":{}},"f:requests":{".":{},"f:cpu":{},"f:memory":{}}},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:enableServiceLinks":{},"f:priority":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}},"manager":"kubectl","operation":"Update","time":"2019-03-15T16:32:03Z"}],"name":"valid-pod","namespace":"namespace-1552667523-23877","resourceVersion":"713","selfLink":"/api/v1/namespaces/namespace-1552667523-23877/pods/valid-pod","uid":"ddb26355-473f-11e9-a548-0242ac110002"},"spec":{"containers":[{"image":"k8s.gcr.io/serve_hostname","imagePullPolicy":"Always","name":"kubernetes-serve-hostname","resources":{"limits":{"cpu":"1","memory":"512Mi"},"requests":{"cpu":"1","memory":"512Mi"}},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File"}],"dnsPolicy":"ClusterFirst","enableServiceLinks":true,"priority":0,"restartPolicy":"Always","schedulerName":"default-scheduler","securityContext":{},"terminationGracePeriodSeconds":30},"status":{"phase":"Pending","qosClass":"Guaranteed"}}
I0315 16:32:04.565] 	object given to template engine was:
I0315 16:32:04.566] 		map[apiVersion:v1 kind:Pod metadata:map[resourceVersion:713 selfLink:/api/v1/namespaces/namespace-1552667523-23877/pods/valid-pod uid:ddb26355-473f-11e9-a548-0242ac110002 creationTimestamp:2019-03-15T16:32:03Z labels:map[name:valid-pod] managedFields:[map[apiVersion:v1 fields:map[f:metadata:map[f:labels:map[.:map[] f:name:map[]]] f:spec:map[f:securityContext:map[] f:terminationGracePeriodSeconds:map[] f:containers:map[k:{"name":"kubernetes-serve-hostname"}:map[.:map[] f:image:map[] f:imagePullPolicy:map[] f:name:map[] f:resources:map[.:map[] f:limits:map[.:map[] f:cpu:map[] f:memory:map[]] f:requests:map[f:memory:map[] .:map[] f:cpu:map[]]] f:terminationMessagePath:map[] f:terminationMessagePolicy:map[]]] f:dnsPolicy:map[] f:enableServiceLinks:map[] f:priority:map[] f:restartPolicy:map[] f:schedulerName:map[]]] manager:kubectl operation:Update time:2019-03-15T16:32:03Z]] name:valid-pod namespace:namespace-1552667523-23877] spec:map[dnsPolicy:ClusterFirst enableServiceLinks:true priority:0 restartPolicy:Always schedulerName:default-scheduler securityContext:map[] terminationGracePeriodSeconds:30 containers:[map[image:k8s.gcr.io/serve_hostname imagePullPolicy:Always name:kubernetes-serve-hostname resources:map[limits:map[memory:512Mi cpu:1] requests:map[cpu:1 memory:512Mi]] terminationMessagePath:/dev/termination-log terminationMessagePolicy:File]]] status:map[phase:Pending qosClass:Guaranteed]]
I0315 16:32:04.566] has:map has no entry for key "missing"
W0315 16:32:04.667] I0315 16:32:04.088409   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:32:04.667] I0315 16:32:04.088629   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 16:32:04.668] error: error executing template "{{.missing}}": template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing"
W0315 16:32:05.089] I0315 16:32:05.088912   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:32:05.090] I0315 16:32:05.089134   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 16:32:05.651] E0315 16:32:05.650505   70236 streamwatcher.go:109] Unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)
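Editor's note: the go-template failure above ("map has no entry for key "missing"") is standard Go text/template behavior when executed with missingkey=error, which the error text indicates kubectl's template printer uses; kubectl evidently names the template "output". A minimal standalone reproduction:

    package main

    import (
    	"fmt"
    	"os"
    	"text/template"
    )

    func main() {
    	// missingkey=error makes absent map keys fatal instead of
    	// printing "<no value>".
    	t := template.Must(template.New("output").Parse("{{.missing}}")).
    		Option("missingkey=error")

    	obj := map[string]interface{}{"kind": "Pod"} // no "missing" key

    	if err := t.Execute(os.Stdout, obj); err != nil {
    		// template: output:1:2: executing "output" at <.missing>:
    		// map has no entry for key "missing"
    		fmt.Fprintln(os.Stderr, err)
    	}
    }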
I0315 16:32:05.752] Successful
I0315 16:32:05.752] message:NAME        READY   STATUS    RESTARTS   AGE
I0315 16:32:05.752] valid-pod   0/1     Pending   0          1s
... skipping 158 lines ...
I0315 16:32:07.945]   terminationGracePeriodSeconds: 30
I0315 16:32:07.945] status:
I0315 16:32:07.945]   phase: Pending
I0315 16:32:07.945]   qosClass: Guaranteed
I0315 16:32:07.945] has:name: valid-pod
I0315 16:32:07.947] Successful
I0315 16:32:07.947] message:Error from server (NotFound): pods "invalid-pod" not found
I0315 16:32:07.948] has:"invalid-pod" not found
I0315 16:32:08.054] pod "valid-pod" deleted
I0315 16:32:08.154] get.sh:193: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 16:32:08.353] pod/redis-master created
I0315 16:32:08.360] pod/valid-pod created
I0315 16:32:08.458] Successful
... skipping 299 lines ...
I0315 16:32:16.963] Running command: run_create_secret_tests
I0315 16:32:16.988] 
I0315 16:32:16.990] +++ Running case: test-cmd.run_create_secret_tests 
I0315 16:32:16.992] +++ working dir: /go/src/k8s.io/kubernetes
I0315 16:32:16.995] +++ command: run_create_secret_tests
I0315 16:32:17.090] Successful
I0315 16:32:17.091] message:Error from server (NotFound): secrets "mysecret" not found
I0315 16:32:17.091] has:secrets "mysecret" not found
W0315 16:32:17.192] I0315 16:32:17.098792   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:32:17.192] I0315 16:32:17.099027   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 16:32:17.293] Successful
I0315 16:32:17.293] message:Error from server (NotFound): secrets "mysecret" not found
I0315 16:32:17.293] has:secrets "mysecret" not found
I0315 16:32:17.293] Successful
I0315 16:32:17.293] message:user-specified
I0315 16:32:17.293] has:user-specified
I0315 16:32:17.345] Successful
I0315 16:32:17.424] {"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-create-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-create-cm","uid":"e5b9f8dd-473f-11e9-a548-0242ac110002","resourceVersion":"823","creationTimestamp":"2019-03-15T16:32:17Z"}}
... skipping 184 lines ...
I0315 16:32:24.584] has:Timeout exceeded while reading body
I0315 16:32:24.677] Successful
I0315 16:32:24.677] message:NAME        READY   STATUS    RESTARTS   AGE
I0315 16:32:24.678] valid-pod   0/1     Pending   0          1s
I0315 16:32:24.678] has:valid-pod
I0315 16:32:24.755] Successful
I0315 16:32:24.756] message:error: Invalid timeout value. Timeout must be a single integer in seconds, or an integer followed by a corresponding time unit (e.g. 1s | 2m | 3h)
I0315 16:32:24.756] has:Invalid timeout value
I0315 16:32:24.844] pod "valid-pod" deleted
I0315 16:32:24.868] +++ exit code: 0
W0315 16:32:25.104] I0315 16:32:25.103065   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:32:25.104] I0315 16:32:25.103540   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 16:32:26.104] I0315 16:32:26.103908   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
... skipping 249 lines ...
W0315 16:32:31.583] I0315 16:32:30.106337   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:32:31.583] I0315 16:32:30.106350   55620 clientconn.go:557] scheme "" not registered, fallback to default scheme
W0315 16:32:31.584] I0315 16:32:30.106424   55620 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W0315 16:32:31.584] I0315 16:32:30.106448   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 16:32:31.584] I0315 16:32:30.106478   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:32:31.584] I0315 16:32:30.114513   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:32:31.585] E0315 16:32:30.116051   58922 resource_quota_controller.go:437] failed to sync resource monitors: [couldn't start monitor for resource "company.com/v1, Resource=bars": unable to monitor quota for resource "company.com/v1, Resource=bars", couldn't start monitor for resource "company.com/v1, Resource=foos": unable to monitor quota for resource "company.com/v1, Resource=foos", couldn't start monitor for resource "company.com/v1, Resource=validfoos": unable to monitor quota for resource "company.com/v1, Resource=validfoos", couldn't start monitor for resource "extensions/v1beta1, Resource=networkpolicies": unable to monitor quota for resource "extensions/v1beta1, Resource=networkpolicies", couldn't start monitor for resource "mygroup.example.com/v1alpha1, Resource=resources": unable to monitor quota for resource "mygroup.example.com/v1alpha1, Resource=resources"]
W0315 16:32:31.585] I0315 16:32:30.205234   58922 controller_utils.go:1034] Caches are synced for garbage collector controller
W0315 16:32:31.586] I0315 16:32:31.106721   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:32:31.586] I0315 16:32:31.107055   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 16:32:31.686] foo.company.com/test patched
I0315 16:32:31.687] crd.sh:237: Successful get foos/test {{.patched}}: value1
I0315 16:32:31.764] foo.company.com/test patched
I0315 16:32:31.856] crd.sh:239: Successful get foos/test {{.patched}}: value2
I0315 16:32:31.953] foo.company.com/test patched
I0315 16:32:32.060] crd.sh:241: Successful get foos/test {{.patched}}: <no value>
I0315 16:32:32.250] +++ [0315 16:32:32] "kubectl patch --local" returns error as expected for CustomResource: error: cannot apply strategic merge patch for company.com/v1, Kind=Foo locally, try --type merge
I0315 16:32:32.323] {
I0315 16:32:32.323]     "apiVersion": "company.com/v1",
I0315 16:32:32.324]     "kind": "Foo",
I0315 16:32:32.324]     "metadata": {
I0315 16:32:32.324]         "annotations": {
I0315 16:32:32.324]             "kubernetes.io/change-cause": "kubectl patch foos/test --server=http://127.0.0.1:8080 --match-server-version=true --patch={\"patched\":null} --type=merge --record=true"
... skipping 336 lines ...
W0315 16:32:45.115] I0315 16:32:45.114702   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:32:45.116] I0315 16:32:45.115294   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 16:32:46.116] I0315 16:32:46.115560   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:32:46.117] I0315 16:32:46.115886   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 16:32:46.914] crd.sh:459: Successful get bars {{len .items}}: 0
I0315 16:32:47.073] customresourcedefinition.apiextensions.k8s.io "foos.company.com" deleted
W0315 16:32:47.174] Error from server (NotFound): namespaces "non-native-resources" not found
W0315 16:32:47.174] I0315 16:32:47.119667   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:32:47.175] I0315 16:32:47.119908   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 16:32:47.275] customresourcedefinition.apiextensions.k8s.io "bars.company.com" deleted
I0315 16:32:47.313] customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
I0315 16:32:47.416] customresourcedefinition.apiextensions.k8s.io "validfoos.company.com" deleted
I0315 16:32:47.451] +++ exit code: 0
... skipping 11 lines ...
I0315 16:32:48.709] +++ [0315 16:32:48] Testing cmd with image
I0315 16:32:48.801] Successful
I0315 16:32:48.801] message:deployment.apps/test1 created
I0315 16:32:48.802] has:deployment.apps/test1 created
I0315 16:32:48.885] deployment.extensions "test1" deleted
I0315 16:32:48.965] Successful
I0315 16:32:48.965] message:error: Invalid image name "InvalidImageName": invalid reference format
I0315 16:32:48.965] has:error: Invalid image name "InvalidImageName": invalid reference format
I0315 16:32:48.978] +++ exit code: 0
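Both assertions in this block are client-side checks: image references are parsed before anything is sent to the API server, and an uppercase repository name is not a valid reference. A hedged reconstruction of the failing call (the pod name is hypothetical; the passing call is the test1 deployment created above):

    # Fails reference parsing locally; compare the test1 run above, which succeeds:
    kubectl run test2 --image=InvalidImageName
    # error: Invalid image name "InvalidImageName": invalid reference format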
W0315 16:32:49.079] kubectl run --generator=deployment/apps.v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
W0315 16:32:49.079] I0315 16:32:48.790674   58922 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552667568-9876", Name:"test1", UID:"f86b50a8-473f-11e9-a548-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"975", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test1-848d5d4b47 to 1
W0315 16:32:49.080] I0315 16:32:48.798022   58922 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552667568-9876", Name:"test1-848d5d4b47", UID:"f86c333c-473f-11e9-a548-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"976", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test1-848d5d4b47-rn42q
W0315 16:32:49.121] I0315 16:32:49.120700   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:32:49.121] I0315 16:32:49.120971   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
... skipping 5 lines ...
W0315 16:32:50.132] I0315 16:32:50.121870   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 16:32:50.233] generic-resources.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 16:32:50.455] generic-resources.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 16:32:50.458] Successful
I0315 16:32:50.458] message:pod/busybox0 created
I0315 16:32:50.458] pod/busybox1 created
I0315 16:32:50.459] error: error validating "hack/testdata/recursive/pod/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0315 16:32:50.459] has:error validating data: kind not set
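The mixed result above (two pods created, one validation error, overall command continuing) is the behavior of kubectl's --recursive flag over a directory that deliberately contains one broken manifest; the directory path is taken from the error text:

    # -R/--recursive walks hack/testdata/recursive/pod; valid manifests are created and the
    # broken one reports its validation error without aborting the others:
    kubectl create -f hack/testdata/recursive/pod --recursive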
I0315 16:32:50.561] generic-resources.sh:211: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 16:32:50.758] generic-resources.sh:219: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: busybox:busybox:
I0315 16:32:50.760] Successful
I0315 16:32:50.761] message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 16:32:50.761] has:Object 'Kind' is missing
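The decode failure is intentional: in the broken fixture the kind key is misspelled as "ind" (visible in the JSON quoted above), so the decoder finds no Kind at all. The offending fragment, rendered as YAML from that JSON:

    # hack/testdata/recursive/pod/pod/busybox-broken.yaml (fragment reconstructed from the error)
    apiVersion: v1
    ind: Pod   # should be "kind:"; without it, decoding fails with "Object 'Kind' is missing"
    metadata:
      labels:
        app: busybox2
      name: busybox2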
I0315 16:32:50.885] generic-resources.sh:226: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 16:32:51.242] generic-resources.sh:230: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
I0315 16:32:51.245] Successful
I0315 16:32:51.246] message:pod/busybox0 replaced
I0315 16:32:51.246] pod/busybox1 replaced
I0315 16:32:51.249] error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0315 16:32:51.250] has:error validating data: kind not set
W0315 16:32:51.350] I0315 16:32:51.122216   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:32:51.351] I0315 16:32:51.122423   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 16:32:51.451] generic-resources.sh:235: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 16:32:51.495] Successful
I0315 16:32:51.496] message:Name:               busybox0
I0315 16:32:51.496] Namespace:          namespace-1552667569-18015
... skipping 161 lines ...
I0315 16:32:51.514] has:Object 'Kind' is missing
I0315 16:32:51.627] generic-resources.sh:245: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 16:32:51.843] generic-resources.sh:249: Successful get pods {{range.items}}{{.metadata.annotations.annotatekey}}:{{end}}: annotatevalue:annotatevalue:
I0315 16:32:51.846] Successful
I0315 16:32:51.846] message:pod/busybox0 annotated
I0315 16:32:51.846] pod/busybox1 annotated
I0315 16:32:51.847] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 16:32:51.847] has:Object 'Kind' is missing
I0315 16:32:51.946] generic-resources.sh:254: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 16:32:52.304] generic-resources.sh:258: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
I0315 16:32:52.306] Successful
I0315 16:32:52.306] message:Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
I0315 16:32:52.306] pod/busybox0 configured
I0315 16:32:52.306] Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
I0315 16:32:52.307] pod/busybox1 configured
I0315 16:32:52.307] error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0315 16:32:52.307] has:error validating data: kind not set
I0315 16:32:52.400] generic-resources.sh:264: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 16:32:52.609] deployment.apps/nginx created
W0315 16:32:52.710] I0315 16:32:51.820976   58922 namespace_controller.go:171] Namespace has been deleted non-native-resources
W0315 16:32:52.711] I0315 16:32:52.122708   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:32:52.711] I0315 16:32:52.122979   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 16:32:52.711] I0315 16:32:52.615234   58922 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552667569-18015", Name:"nginx", UID:"fab30296-473f-11e9-a548-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1001", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-5f7cff5b56 to 3
... skipping 49 lines ...
I0315 16:32:53.086] deployment.extensions "nginx" deleted
I0315 16:32:53.188] generic-resources.sh:280: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 16:32:53.366] generic-resources.sh:284: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 16:32:53.368] Successful
I0315 16:32:53.368] message:kubectl convert is DEPRECATED and will be removed in a future version.
I0315 16:32:53.368] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
I0315 16:32:53.368] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 16:32:53.369] has:Object 'Kind' is missing
W0315 16:32:53.469] kubectl convert is DEPRECATED and will be removed in a future version.
W0315 16:32:53.470] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
W0315 16:32:53.470] I0315 16:32:53.123200   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:32:53.470] I0315 16:32:53.123429   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 16:32:53.570] generic-resources.sh:289: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 16:32:53.571] Successful
I0315 16:32:53.571] message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 16:32:53.571] has:busybox0:busybox1:
I0315 16:32:53.572] Successful
I0315 16:32:53.572] message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 16:32:53.572] has:Object 'Kind' is missing
I0315 16:32:53.674] generic-resources.sh:298: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 16:32:53.798] pod/busybox0 labeled pod/busybox1 labeled error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 16:32:53.906] generic-resources.sh:303: Successful get pods {{range.items}}{{.metadata.labels.mylabel}}:{{end}}: myvalue:myvalue:
I0315 16:32:53.908] Successful
I0315 16:32:53.908] message:pod/busybox0 labeled
I0315 16:32:53.909] pod/busybox1 labeled
I0315 16:32:53.909] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 16:32:53.909] has:Object 'Kind' is missing
I0315 16:32:54.019] generic-resources.sh:308: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 16:32:54.126] pod/busybox0 patched pod/busybox1 patched error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
W0315 16:32:54.226] I0315 16:32:54.123946   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:32:54.227] I0315 16:32:54.124148   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 16:32:54.328] generic-resources.sh:313: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: prom/busybox:prom/busybox:
I0315 16:32:54.328] Successful
I0315 16:32:54.328] message:pod/busybox0 patched
I0315 16:32:54.328] pod/busybox1 patched
I0315 16:32:54.329] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 16:32:54.329] has:Object 'Kind' is missing
I0315 16:32:54.372] generic-resources.sh:318: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 16:32:54.562] generic-resources.sh:322: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 16:32:54.564] Successful
I0315 16:32:54.565] message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
I0315 16:32:54.565] pod "busybox0" force deleted
I0315 16:32:54.565] pod "busybox1" force deleted
I0315 16:32:54.565] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 16:32:54.565] has:Object 'Kind' is missing
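Forced deletion skips graceful termination, hence the warning above; applied recursively it removes the two valid pods while the broken manifest again fails to decode. A sketch using the real kubectl flags (the exact test invocation is not shown in this log):

    kubectl delete -f hack/testdata/recursive/pod --recursive --force --grace-period=0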
I0315 16:32:54.662] generic-resources.sh:327: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 16:32:54.849] replicationcontroller/busybox0 created
I0315 16:32:54.855] replicationcontroller/busybox1 created
W0315 16:32:54.955] error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W0315 16:32:54.956] I0315 16:32:54.854723   58922 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552667569-18015", Name:"busybox0", UID:"fc08c41f-473f-11e9-a548-0242ac110002", APIVersion:"v1", ResourceVersion:"1032", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-zlm8h
W0315 16:32:54.956] I0315 16:32:54.861185   58922 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552667569-18015", Name:"busybox1", UID:"fc09b242-473f-11e9-a548-0242ac110002", APIVersion:"v1", ResourceVersion:"1034", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-8qbdf
I0315 16:32:55.057] generic-resources.sh:331: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 16:32:55.069] generic-resources.sh:336: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 16:32:55.181] generic-resources.sh:337: Successful get rc busybox0 {{.spec.replicas}}: 1
I0315 16:32:55.295] generic-resources.sh:338: Successful get rc busybox1 {{.spec.replicas}}: 1
I0315 16:32:56.024] generic-resources.sh:343: Successful get hpa busybox0 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
I0315 16:32:56.114] generic-resources.sh:344: Successful get hpa busybox1 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
I0315 16:32:56.116] Successful
I0315 16:32:56.116] message:horizontalpodautoscaler.autoscaling/busybox0 autoscaled
I0315 16:32:56.117] horizontalpodautoscaler.autoscaling/busybox1 autoscaled
I0315 16:32:56.117] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 16:32:56.117] has:Object 'Kind' is missing
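The HPA assertions (minReplicas 1, maxReplicas 2, target 80%) imply an autoscale step shaped roughly like the following; that autoscale is driven by -f over the rc directory here is an assumption, but the numeric flags match the assertions:

    # One HPA per decodable replication controller; 1/2/80 mirror the assertions above:
    kubectl autoscale -f hack/testdata/recursive/rc --recursive --min=1 --max=2 --cpu-percent=80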
I0315 16:32:56.194] horizontalpodautoscaler.autoscaling "busybox0" deleted
W0315 16:32:56.295] I0315 16:32:55.124404   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:32:56.295] I0315 16:32:55.124625   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 16:32:56.296] I0315 16:32:56.124911   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:32:56.296] I0315 16:32:56.125132   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
... skipping 3 lines ...
I0315 16:32:56.589] generic-resources.sh:354: Successful get rc busybox1 {{.spec.replicas}}: 1
I0315 16:32:56.777] generic-resources.sh:358: Successful get service busybox0 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
I0315 16:32:56.875] generic-resources.sh:359: Successful get service busybox1 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
I0315 16:32:56.878] Successful
I0315 16:32:56.879] message:service/busybox0 exposed
I0315 16:32:56.879] service/busybox1 exposed
I0315 16:32:56.879] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 16:32:56.879] has:Object 'Kind' is missing
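Each valid RC is exposed as a service with an unnamed port 80, matching the "<no value> 80" assertions. A sketch under the same assumption that the test drives expose through the recursive rc directory:

    kubectl expose -f hack/testdata/recursive/rc --recursive --port=80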
I0315 16:32:56.976] generic-resources.sh:365: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 16:32:57.078] generic-resources.sh:366: Successful get rc busybox0 {{.spec.replicas}}: 1
I0315 16:32:57.177] generic-resources.sh:367: Successful get rc busybox1 {{.spec.replicas}}: 1
I0315 16:32:57.390] generic-resources.sh:371: Successful get rc busybox0 {{.spec.replicas}}: 2
I0315 16:32:57.478] generic-resources.sh:372: Successful get rc busybox1 {{.spec.replicas}}: 2
I0315 16:32:57.480] Successful
I0315 16:32:57.480] message:replicationcontroller/busybox0 scaled
I0315 16:32:57.481] replicationcontroller/busybox1 scaled
I0315 16:32:57.481] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 16:32:57.481] has:Object 'Kind' is missing
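Both RCs move from 1 to 2 replicas, per the surrounding assertions; recursively that is roughly:

    kubectl scale -f hack/testdata/recursive/rc --recursive --replicas=2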
I0315 16:32:57.578] generic-resources.sh:377: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 16:32:57.766] Waiting for Get pods {{range.items}}{{.metadata.name}}:{{end}} : expected: , got: busybox0-4sfwq:busybox0-zlm8h:busybox1-8qbdf:busybox1-xd2zw:
I0315 16:32:57.768] 
I0315 16:32:57.772] generic-resources.sh:381: FAIL!
I0315 16:32:57.773] Get pods {{range.items}}{{.metadata.name}}:{{end}}
I0315 16:32:57.773]   Expected: 
I0315 16:32:57.773]   Got:      busybox0-4sfwq:busybox0-zlm8h:busybox1-8qbdf:busybox1-xd2zw:
I0315 16:32:57.773] 
I0315 16:32:57.773] 51 /go/src/k8s.io/kubernetes/hack/lib/test.sh
I0315 16:32:57.774] 
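The FAIL above is emitted by kube::test::get_object_assert (hack/lib/test.sh, see the call tree below): it runs kubectl get with a go-template, printing the "Waiting for ..." line on a mismatch and declaring FAIL once retries are exhausted. A paraphrased sketch, not the exact source:

    # kube::test::get_object_assert <resource> <go-template> <expected>
    kube::test::get_object_assert() {
      local object=$1 request=$2 expected=$3 tries=5
      local res
      for ((i = 0; i < tries; i++)); do
        res=$(kubectl get "${object}" -o go-template="${request}")
        [[ "${res}" == "${expected}" ]] && return 0
        echo "Waiting for Get ${object} ${request} : expected: ${expected}, got: ${res}"
        sleep 1
      done
      echo "FAIL!"
      return 1
    }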
I0315 16:32:57.856] junit report dir: /workspace/artifacts
I0315 16:32:57.859] +++ [0315 16:32:57] Clean up complete
I0315 16:32:57.860] Makefile:298: recipe for target 'test-cmd' failed
W0315 16:32:57.961] I0315 16:32:57.125567   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 16:32:57.962] I0315 16:32:57.125893   55620 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 16:32:57.962] I0315 16:32:57.279507   58922 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552667569-18015", Name:"busybox0", UID:"fc08c41f-473f-11e9-a548-0242ac110002", APIVersion:"v1", ResourceVersion:"1054", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-4sfwq
W0315 16:32:57.962] I0315 16:32:57.294130   58922 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552667569-18015", Name:"busybox1", UID:"fc09b242-473f-11e9-a548-0242ac110002", APIVersion:"v1", ResourceVersion:"1058", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-xd2zw
W0315 16:32:57.962] !!! [0315 16:32:57] Call tree:
W0315 16:32:57.962] !!! [0315 16:32:57]  1: /go/src/k8s.io/kubernetes/test/cmd/../../test/cmd/generic-resources.sh:381 kube::test::get_object_assert(...)
W0315 16:32:57.963] !!! [0315 16:32:57]  2: /go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:558 run_recursive_resources_tests(...)
W0315 16:32:57.963] !!! [0315 16:32:57]  3: hack/make-rules/test-cmd.sh:123 runTests(...)
W0315 16:32:57.963] I0315 16:32:57.795956   55620 crdregistration_controller.go:143] Shutting down crd-autoregister controller
W0315 16:32:57.963] I0315 16:32:57.796106   55620 controller.go:87] Shutting down OpenAPI AggregationController
W0315 16:32:57.963] W0315 16:32:57.800761   55620 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0315 16:32:57.963] I0315 16:32:57.796131   55620 available_controller.go:332] Shutting down AvailableConditionController
W0315 16:32:57.964] I0315 16:32:57.800832   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:32:57.964] I0315 16:32:57.800866   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:32:57.964] I0315 16:32:57.796147   55620 customresource_discovery_controller.go:219] Shutting down DiscoveryController
... skipping 28 lines ...
W0315 16:32:57.970] I0315 16:32:57.797818   55620 secure_serving.go:160] Stopped listening on 127.0.0.1:8080
... skipping 96 lines ...
W0315 16:32:57.993] I0315 16:32:57.796166   55620 autoregister_controller.go:163] Shutting down autoregister controller
... skipping 78 lines ...
W0315 16:32:58.011] I0315 16:32:57.802120   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:32:58.011] I0315 16:32:57.802134   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
... skipping 4 lines ...
W0315 16:32:58.012] I0315 16:32:57.802231   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:32:58.012] I0315 16:32:57.802245   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:32:58.012] I0315 16:32:57.802260   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:32:58.013] I0315 16:32:57.802275   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:32:58.013] I0315 16:32:57.802288   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:32:58.013] I0315 16:32:57.802301   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:32:58.013] W0315 16:32:57.802323   55620 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0315 16:32:58.013] I0315 16:32:57.802340   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:32:58.014] I0315 16:32:57.802442   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:32:58.014] I0315 16:32:57.802459   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:32:58.014] I0315 16:32:57.802521   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:32:58.014] I0315 16:32:57.802683   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:32:58.014] I0315 16:32:57.802762   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
... skipping 3 lines ...
W0315 16:32:58.015] I0315 16:32:57.802839   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:32:58.015] I0315 16:32:57.802938   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:32:58.016] I0315 16:32:57.803060   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:32:58.016] I0315 16:32:57.803194   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:32:58.016] I0315 16:32:57.803314   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 16:32:58.016] I0315 16:32:57.803399   55620 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
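The repeated pair of messages above comes from the apiserver's etcd client: once the test-cmd run tears down etcd, nothing is listening on 127.0.0.1:2379, so every gRPC reconnect attempt fails immediately with "connection refused". A minimal, illustrative Python sketch (not from this repo) that reproduces the underlying dial failure:

    import socket

    # Illustrative only: with no listener on 127.0.0.1:2379 (etcd already
    # torn down), every connection attempt fails at once with ECONNREFUSED,
    # which is what the gRPC client above keeps logging as it retries.
    try:
        socket.create_connection(("127.0.0.1", 2379), timeout=1)
    except socket.error as err:
        print(err)  # e.g. [Errno 111] Connection refused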
W0315 16:32:58.016] make: *** [test-cmd] Error 1
W0315 16:34:05.770] Traceback (most recent call last):
W0315 16:34:05.770]   File "/workspace/./test-infra/jenkins/../scenarios/kubernetes_verify.py", line 178, in <module>
W0315 16:34:05.770]     ARGS.exclude_typecheck, ARGS.exclude_godep)
W0315 16:34:05.770]   File "/workspace/./test-infra/jenkins/../scenarios/kubernetes_verify.py", line 140, in main
W0315 16:34:05.770]     check(*cmd)
W0315 16:34:05.771]   File "/workspace/./test-infra/jenkins/../scenarios/kubernetes_verify.py", line 48, in check
W0315 16:34:05.771]     subprocess.check_call(cmd)
W0315 16:34:05.771]   File "/usr/lib/python2.7/subprocess.py", line 186, in check_call
W0315 16:34:05.819]     raise CalledProcessError(retcode, cmd)
W0315 16:34:05.819] subprocess.CalledProcessError: Command '('docker', 'run', '--rm=true', '--privileged=true', '-v', '/var/run/docker.sock:/var/run/docker.sock', '-v', '/etc/localtime:/etc/localtime:ro', '-v', '/workspace/k8s.io/kubernetes:/go/src/k8s.io/kubernetes', '-v', '/workspace/k8s.io/:/workspace/k8s.io/', '-v', '/workspace/_artifacts:/workspace/artifacts', '-e', 'KUBE_FORCE_VERIFY_CHECKS=y', '-e', 'KUBE_VERIFY_GIT_BRANCH=master', '-e', 'EXCLUDE_TYPECHECK=n', '-e', 'EXCLUDE_GODEP=n', '-e', 'REPO_DIR=/workspace/k8s.io/kubernetes', '--tmpfs', '/tmp:exec,mode=1777', 'gcr.io/k8s-testimages/kubekins-test:1.13-v20190125-cc5d6ecff3', 'bash', '-c', 'cd kubernetes && ./hack/jenkins/test-dockerized.sh')' returned non-zero exit status 2
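The traceback shows the failure path: scenarios/kubernetes_verify.py wraps the verify job in a `docker run` and surfaces any non-zero exit via subprocess.check_call. A minimal sketch of its check() helper, reconstructed from the frames above (the real implementation lives in kubernetes/test-infra):

    import subprocess

    def check(*cmd):
        """Run cmd, raising CalledProcessError on a non-zero exit status.

        Sketch based on the traceback above: here the dockerized
        test-dockerized.sh exited with status 2, which check_call
        turned into the CalledProcessError that failed the job.
        """
        print('Run:', cmd)
        subprocess.check_call(cmd)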
E0315 16:34:05.826] Command failed
I0315 16:34:05.827] process 507 exited with code 1 after 12.5m
E0315 16:34:05.827] FAIL: ci-kubernetes-integration-master
I0315 16:34:05.827] Call:  gcloud auth activate-service-account --key-file=/etc/service-account/service-account.json
W0315 16:34:06.606] Activated service account credentials for: [pr-kubekins@kubernetes-jenkins-pull.iam.gserviceaccount.com]
I0315 16:34:06.665] process 76851 exited with code 0 after 0.0m
I0315 16:34:06.665] Call:  gcloud config get-value account
I0315 16:34:07.055] process 76863 exited with code 0 after 0.0m
I0315 16:34:07.056] Will upload results to gs://kubernetes-jenkins/logs using pr-kubekins@kubernetes-jenkins-pull.iam.gserviceaccount.com
I0315 16:34:07.056] Upload result and artifacts...
I0315 16:34:07.056] Gubernator results at https://gubernator.k8s.io/build/kubernetes-jenkins/logs/ci-kubernetes-integration-master/9500
I0315 16:34:07.057] Call:  gsutil ls gs://kubernetes-jenkins/logs/ci-kubernetes-integration-master/9500/artifacts
W0315 16:34:08.611] CommandException: One or more URLs matched no objects.
E0315 16:34:08.802] Command failed
I0315 16:34:08.803] process 76875 exited with code 1 after 0.0m
W0315 16:34:08.803] Remote dir gs://kubernetes-jenkins/logs/ci-kubernetes-integration-master/9500/artifacts not exist yet
I0315 16:34:08.803] Call:  gsutil -m -q -o GSUtil:use_magicfile=True cp -r -c -z log,txt,xml /workspace/_artifacts gs://kubernetes-jenkins/logs/ci-kubernetes-integration-master/9500/artifacts
I0315 16:34:16.232] process 77017 exited with code 0 after 0.1m
W0315 16:34:16.233] metadata path /workspace/_artifacts/metadata.json does not exist
W0315 16:34:16.233] metadata not found or invalid, init with empty metadata
... skipping 15 lines ...