Result   | FAILURE
Tests    | 1 failed / 133 succeeded
Started  |
Elapsed  | 15m49s
Revision | master
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=test\-cmd\srun\_kubectl\_request\_timeout\_tests$'
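The focus expression above is the JUnit test name generated for the failing shell case. A minimal local-reproduction sketch, assuming a full k8s.io/kubernetes checkout (the checkout path and etcd setup are the standard ones from the contributor docs, not taken from this log):

    # Run the kubectl CLI suite locally. hack/install-etcd.sh fetches the
    # etcd version the suite expects; make test-cmd drives
    # hack/make-rules/test-cmd.sh, the same entry point the job uses.
    cd "$GOPATH/src/k8s.io/kubernetes"
    hack/install-etcd.sh
    export PATH="$(pwd)/third_party/etcd:${PATH}"
    make test-cmd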
From junit_test-cmd.xml:

!!! [0113 12:39:53] Call tree:
!!! [0113 12:39:53]  1: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:47 run_kubectl_request_timeout_tests(...)
!!! [0113 12:39:53]  2: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:112 eVal(...)
!!! [0113 12:39:53]  3: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:131 juLog(...)
!!! [0113 12:39:53]  4: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:576 record_command(...)
!!! [0113 12:39:53]  5: hack/make-rules/test-cmd.sh:184 runTests(...)
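For context, the frames above come from the suite's shell JUnit wrapper: each test-cmd case is invoked through record_command, which hands the function to sh2ju's juLog, which eval's it and records the result as JUnit XML. A simplified sketch of that pattern, not the literal source (the real code lives in test/cmd/legacy-script.sh and third_party/forked/shell2junit/sh2ju.sh, and juLog's exact flags are an assumption here):

    # Simplified sketch of the wrapper the call tree reflects.
    record_command() {
      local name="$1"; shift
      # juLog eval's the test function, captures its output, and emits a
      # JUnit entry; on failure it prints the "!!! Call tree" seen above.
      juLog -class=test-cmd -name="${name}" "$@"
    }
    record_command run_kubectl_request_timeout_tests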
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdCompletion
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdCompletion/shell_not_expected
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdCompletion/unsupported_shell_type
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitAPIPort
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitAPIPort/accept_a_valid_port_number
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitAPIPort/fail_on_negative_port_number
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitAPIPort/fail_on_non-string_port
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitAPIPort/fail_on_too_large_port_number
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitCertPhaseCSR
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitCertPhaseCSR/fails_on_CSR
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitCertPhaseCSR/fails_on_all
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitCertPhaseCSR/generate_CSR
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitConfig
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitConfig/can't_load_old_component_config
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitConfig/can't_load_old_v1alpha1_config
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitConfig/can't_load_old_v1alpha2_config
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitConfig/can't_load_old_v1alpha3_config
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitConfig/can_load_current_component_config
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitConfig/can_load_v1beta1_config
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitConfig/can_load_v1beta2_config
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitConfig/don't_allow_mixed_arguments_v1beta1
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitConfig/don't_allow_mixed_arguments_v1beta2
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitConfig/fail_on_non_existing_path
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitFeatureGates
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitFeatureGates/feature_gate_IPv6DualStack=true
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitFeatureGates/feature_gate_PublicKeysECDSA=true
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitFeatureGates/no_feature_gates_passed
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitKubernetesVersion
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitKubernetesVersion/invalid_semantic_version_string_is_detected
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitKubernetesVersion/valid_version_is_accepted
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitToken
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitToken/invalid_token_non-lowercase
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitToken/invalid_token_size
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdInitToken/valid_token_is_accepted
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdJoinArgsMixed
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdJoinArgsMixed/discovery-token_and_config
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdJoinBadArgs
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdJoinBadArgs/discovery-token_and_discovery-file_can't_both_be_set
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdJoinBadArgs/discovery-token_or_discovery-file_must_be_set
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdJoinConfig
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdJoinConfig/config
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdJoinConfig/config_path
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdJoinDiscoveryFile
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdJoinDiscoveryFile/invalid_discovery_file
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdJoinDiscoveryFile/valid_discovery_file
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdJoinDiscoveryToken
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdJoinDiscoveryToken/valid_discovery_token
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdJoinDiscoveryToken/valid_discovery_token_url
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdJoinNodeName
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdJoinNodeName/valid_node_name
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdJoinTLSBootstrapToken
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdJoinTLSBootstrapToken/valid_bootstrap_token
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdJoinTLSBootstrapToken/valid_bootstrap_token_url
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdJoinToken
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdJoinToken/valid_token
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdJoinToken/valid_token_url
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdTokenDelete
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdTokenDelete/invalid_token
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdTokenDelete/no_token_provided
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdTokenGenerate
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdTokenGenerateTypoError
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdVersion
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdVersion/default_output
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdVersion/invalid_output_option
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdVersion/short_output
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdVersionOutputJsonOrYaml
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdVersionOutputJsonOrYaml/json_output
k8s.io/kubernetes/cmd/kubeadm/test/cmd TestCmdVersionOutputJsonOrYaml/yaml_output
test-cmd run_RESTMapper_evaluation_tests
test-cmd run_assert_categories_tests
test-cmd run_assert_short_name_tests
test-cmd run_authorization_tests
test-cmd run_certificates_tests
test-cmd run_client_config_tests
test-cmd run_cluster_management_tests
test-cmd run_clusterroles_tests
test-cmd run_configmap_tests
test-cmd run_crd_tests
test-cmd run_create_job_tests
test-cmd run_create_secret_tests
test-cmd run_daemonset_history_tests
test-cmd run_daemonset_tests
test-cmd run_deployment_tests
test-cmd run_exec_credentials_tests
test-cmd run_impersonation_tests
test-cmd run_job_tests
test-cmd run_kubectl_all_namespace_tests
test-cmd run_kubectl_apply_deployments_tests
test-cmd run_kubectl_apply_tests
test-cmd run_kubectl_config_set_cluster_tests
test-cmd run_kubectl_config_set_credentials_tests
test-cmd run_kubectl_config_set_tests
test-cmd run_kubectl_create_error_tests
test-cmd run_kubectl_create_filter_tests
test-cmd run_kubectl_create_kustomization_directory_tests
test-cmd run_kubectl_debug_node_tests
test-cmd run_kubectl_debug_pod_tests
test-cmd run_kubectl_delete_allnamespaces_tests
test-cmd run_kubectl_diff_same_names
test-cmd run_kubectl_diff_tests
test-cmd run_kubectl_exec_pod_tests
test-cmd run_kubectl_exec_resource_name_tests
test-cmd run_kubectl_explain_tests
test-cmd run_kubectl_get_tests
test-cmd run_kubectl_local_proxy_tests
test-cmd run_kubectl_run_tests
test-cmd run_kubectl_server_side_apply_tests
test-cmd run_kubectl_sort_by_tests
test-cmd run_kubectl_version_tests
test-cmd run_lists_tests
test-cmd run_multi_resources_tests
test-cmd run_namespace_tests
test-cmd run_nodes_tests
test-cmd run_persistent_volume_claims_tests
test-cmd run_persistent_volumes_tests
test-cmd run_plugins_tests
test-cmd run_pod_templates_tests
test-cmd run_pod_tests
test-cmd run_rc_tests
test-cmd run_resource_aliasing_tests
test-cmd run_retrieve_multiple_tests
test-cmd run_role_tests
test-cmd run_rs_tests
test-cmd run_save_config_tests
test-cmd run_secrets_test
test-cmd run_service_accounts_tests
test-cmd run_service_tests
test-cmd run_stateful_set_tests
test-cmd run_statefulset_history_tests
test-cmd run_storage_class_tests
test-cmd run_swagger_tests
test-cmd run_template_output_tests
test-cmd run_wait_tests
... skipping 61 lines ...
Recording: record_command_canary
Running command: record_command_canary

+++ Running case: test-cmd.record_command_canary
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: record_command_canary
/home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh: line 155: bogus-expected-to-fail: command not found
!!! [0113 12:32:34] Call tree:
!!! [0113 12:32:34]  1: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:47 record_command_canary(...)
!!! [0113 12:32:34]  2: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:112 eVal(...)
!!! [0113 12:32:34]  3: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:131 juLog(...)
!!! [0113 12:32:34]  4: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:159 record_command(...)
!!! [0113 12:32:34]  5: hack/make-rules/test-cmd.sh:35 source(...)
+++ exit code: 1
+++ error: 1
+++ [0113 12:32:35] Running kubeadm tests
+++ [0113 12:32:39] Building go targets for linux/amd64: cmd/kubeadm
+++ [0113 12:33:23] Running tests without code coverage
{"Time":"2021-01-13T12:34:48.188953358Z","Action":"output","Package":"k8s.io/kubernetes/cmd/kubeadm/test/cmd","Output":"ok \tk8s.io/kubernetes/cmd/kubeadm/test/cmd\t49.189s\n"}
✓ cmd/kubeadm/test/cmd (49.193s)
... skipping 345 lines ...
I0113 12:37:02.103422 54773 client.go:360] parsed scheme: "passthrough"
I0113 12:37:02.103496 54773 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379 <nil> 0 <nil>}] <nil> <nil>}
I0113 12:37:02.103510 54773 clientconn.go:948] ClientConn switching balancer to "pick_first"
+++ [0113 12:37:05] Generate kubeconfig for controller-manager
+++ [0113 12:37:05] Starting controller-manager
I0113 12:37:06.447927 58414 serving.go:331] Generated self-signed cert in-memory
W0113 12:37:07.026720 58414 authentication.go:410] failed to read in-cluster kubeconfig for delegated authentication: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W0113 12:37:07.026771 58414 authentication.go:307] No authentication-kubeconfig provided in order to lookup client-ca-file in configmap/extension-apiserver-authentication in kube-system, so client certificate authentication won't work.
W0113 12:37:07.026778 58414 authentication.go:331] No authentication-kubeconfig provided in order to lookup requestheader-client-ca-file in configmap/extension-apiserver-authentication in kube-system, so request-header client certificate authentication won't work.
W0113 12:37:07.026797 58414 authorization.go:208] failed to read in-cluster kubeconfig for delegated authorization: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W0113 12:37:07.026820 58414 authorization.go:176] No authorization-kubeconfig provided, so SubjectAccessReview of authorization tokens won't work.
I0113 12:37:07.026842 58414 controllermanager.go:176] Version: v1.21.0-alpha.0.790+f6e04cd3ad1f3a
I0113 12:37:07.028068 58414 secure_serving.go:197] Serving securely on [::]:10257
I0113 12:37:07.028153 58414 tlsconfig.go:240] Starting DynamicServingCertificateController
I0113 12:37:07.028870 58414 deprecated_insecure_serving.go:53] Serving insecurely on [::]:10252
I0113 12:37:07.028938 58414 leaderelection.go:243] attempting to acquire leader lease kube-system/kube-controller-manager...
... skipping 10 lines ...
I0113 12:37:07.507031 58414 controllermanager.go:554] Started "persistentvolume-expander"
I0113 12:37:07.507149 58414 expand_controller.go:310] Starting expand controller
I0113 12:37:07.507180 58414 shared_informer.go:240] Waiting for caches to sync for expand
W0113 12:37:07.507365 58414 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
Client Version: version.Info{Major:"1", Minor:"21+", GitVersion:"v1.21.0-alpha.0.790+f6e04cd3ad1f3a", GitCommit:"f6e04cd3ad1f3a1a00fd091e1524d2d27dc56bd7", GitTreeState:"clean", BuildDate:"2021-01-13T12:18:35Z", GoVersion:"go1.15.5", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"21+", GitVersion:"v1.21.0-alpha.0.790+f6e04cd3ad1f3a", GitCommit:"f6e04cd3ad1f3a1a00fd091e1524d2d27dc56bd7", GitTreeState:"clean", BuildDate:"2021-01-13T12:18:35Z", GoVersion:"go1.15.5", Compiler:"gc", Platform:"linux/amd64"}
The Service "kubernetes" is invalid: spec.clusterIPs: Invalid value: []string{"10.0.0.1"}: failed to allocated ip:10.0.0.1 with error:provided IP is already allocated
W0113 12:37:07.959256 58414 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0113 12:37:07.959341 58414 resource_quota_monitor.go:229] QuotaMonitor created object count evaluator for endpoints
W0113 12:37:07.959376 58414 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0113 12:37:07.959399 58414 resource_quota_monitor.go:229] QuotaMonitor created object count evaluator for statefulsets.apps
W0113 12:37:07.959441 58414 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0113 12:37:07.959495 58414 resource_quota_monitor.go:229] QuotaMonitor created object count evaluator for rolebindings.rbac.authorization.k8s.io
... skipping 100 lines ...
I0113 12:37:07.967358 58414 certificate_controller.go:118] Starting certificate controller "csrapproving"
I0113 12:37:07.967381 58414 shared_informer.go:240] Waiting for caches to sync for certificate-csrapproving
I0113 12:37:07.967595 58414 controllermanager.go:554] Started "ttl"
W0113 12:37:07.967629 58414 controllermanager.go:533] "tokencleaner" is disabled
I0113 12:37:07.967793 58414 ttl_controller.go:121] Starting TTL controller
I0113 12:37:07.967820 58414 shared_informer.go:240] Waiting for caches to sync for TTL
E0113 12:37:07.968103 58414 core.go:91] Failed to start service controller: WARNING: no cloud provider provided, services of type LoadBalancer will fail
W0113 12:37:07.968124 58414 controllermanager.go:546] Skipping "service"
I0113 12:37:07.968135 58414 core.go:241] Will not configure cloud provider routes for allocate-node-cidrs: false, configure-cloud-routes: true.
W0113 12:37:07.968144 58414 controllermanager.go:546] Skipping "route"
I0113 12:37:07.968522 58414 controllermanager.go:554] Started "daemonset"
I0113 12:37:07.968948 58414 daemon_controller.go:285] Starting daemon sets controller
I0113 12:37:07.968966 58414 shared_informer.go:240] Waiting for caches to sync for daemon sets
... skipping 2 lines ...
I0113 12:37:07.975585 58414 shared_informer.go:240] Waiting for caches to sync for namespace
I0113 12:37:07.975984 58414 garbagecollector.go:142] Starting garbage collector controller
I0113 12:37:07.976003 58414 shared_informer.go:240] Waiting for caches to sync for garbage collector
I0113 12:37:07.976026 58414 graph_builder.go:289] GraphBuilder running
I0113 12:37:07.976042 58414 controllermanager.go:554] Started "garbagecollector"
I0113 12:37:07.976766 58414 node_lifecycle_controller.go:76] Sending events to api server
E0113 12:37:07.976816 58414 core.go:231] failed to start cloud node lifecycle controller: no cloud provider provided
W0113 12:37:07.976825 58414 controllermanager.go:546] Skipping "cloud-node-lifecycle"
I0113 12:37:07.977356 58414 controllermanager.go:554] Started "endpoint"
I0113 12:37:07.977958 58414 endpoints_controller.go:184] Starting endpoint controller
I0113 12:37:07.978102 58414 shared_informer.go:240] Waiting for caches to sync for endpoint
I0113 12:37:07.979077 58414 controllermanager.go:554] Started "replicaset"
I0113 12:37:07.980580 58414 controllermanager.go:554] Started "csrcleaner"
... skipping 12 lines ...
I0113 12:37:07.987435 58414 controllermanager.go:554] Started "root-ca-cert-publisher"
I0113 12:37:07.987687 58414 publisher.go:98] Starting root CA certificate configmap publisher
I0113 12:37:07.987707 58414 shared_informer.go:240] Waiting for caches to sync for crt configmap
I0113 12:37:07.987996 58414 controllermanager.go:554] Started "endpointslice"
I0113 12:37:07.988024 58414 endpointslice_controller.go:237] Starting endpoint slice controller
I0113 12:37:07.988372 58414 shared_informer.go:240] Waiting for caches to sync for endpoint_slice
W0113 12:37:08.006719 58414 actual_state_of_world.go:534] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="127.0.0.1" does not exist
I0113 12:37:08.007587 58414 shared_informer.go:247] Caches are synced for expand
I0113 12:37:08.061596 58414 shared_informer.go:247] Caches are synced for service account
I0113 12:37:08.062280 58414 shared_informer.go:247] Caches are synced for HPA
I0113 12:37:08.062791 58414 shared_informer.go:247] Caches are synced for PV protection
NAME         TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
kubernetes   ClusterIP   10.0.0.1     <none>        443/TCP   35s
... skipping 16 lines ...
I0113 12:37:08.075767 58414 shared_informer.go:247] Caches are synced for namespace
I0113 12:37:08.078420 58414 shared_informer.go:247] Caches are synced for endpoint
I0113 12:37:08.083013 58414 shared_informer.go:247] Caches are synced for persistent volume
I0113 12:37:08.083292 58414 shared_informer.go:247] Caches are synced for ClusterRoleAggregator
I0113 12:37:08.087767 58414 shared_informer.go:247] Caches are synced for crt configmap
I0113 12:37:08.088651 58414 shared_informer.go:247] Caches are synced for endpoint_slice
E0113 12:37:08.091310 58414 clusterroleaggregation_controller.go:181] admin failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "admin": the object has been modified; please apply your changes to the latest version and try again
E0113 12:37:08.092446 58414 clusterroleaggregation_controller.go:181] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again

+++ Running case: test-cmd.run_kubectl_version_tests
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_version_tests
E0113 12:37:08.110627 58414 clusterroleaggregation_controller.go:181] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again
+++ [0113 12:37:08] Testing kubectl version
{
  "major": "1",
  "minor": "21+",
  "gitVersion": "v1.21.0-alpha.0.790+f6e04cd3ad1f3a",
  "gitCommit": "f6e04cd3ad1f3a1a00fd091e1524d2d27dc56bd7",
... skipping 114 lines ...
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_RESTMapper_evaluation_tests
+++ [0113 12:37:13] Creating namespace namespace-1610541433-3845
namespace/namespace-1610541433-3845 created
Context "test" modified.
+++ [0113 12:37:13] Testing RESTMapper
+++ [0113 12:37:13] "kubectl get unknownresourcetype" returns error as expected: error: the server doesn't have a resource type "unknownresourcetype"
+++ exit code: 0
NAME                SHORTNAMES   APIVERSION   NAMESPACED   KIND
bindings                         v1           true         Binding
componentstatuses   cs           v1           false        ComponentStatus
configmaps          cm           v1           true         ConfigMap
endpoints           ep           v1           true         Endpoints
... skipping 62 lines ...
namespace/namespace-1610541437-26993 created
Context "test" modified.
+++ [0113 12:37:17] Testing clusterroles
rbac.sh:29: Successful get clusterroles/cluster-admin {{.metadata.name}}: cluster-admin
rbac.sh:30: Successful get clusterrolebindings/cluster-admin {{.metadata.name}}: cluster-admin
Successful
message:Error from server (NotFound): clusterroles.rbac.authorization.k8s.io "pod-admin" not found
has:clusterroles.rbac.authorization.k8s.io "pod-admin" not found
clusterrole.rbac.authorization.k8s.io/pod-admin created (dry run)
clusterrole.rbac.authorization.k8s.io/pod-admin created (server dry run)
Successful
message:Error from server (NotFound): clusterroles.rbac.authorization.k8s.io "pod-admin" not found
has:clusterroles.rbac.authorization.k8s.io "pod-admin" not found
clusterrole.rbac.authorization.k8s.io/pod-admin created
rbac.sh:42: Successful get clusterrole/pod-admin {{range.rules}}{{range.verbs}}{{.}}:{{end}}{{end}}: *:
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
clusterrole.rbac.authorization.k8s.io "pod-admin" deleted
... skipping 18 lines ...
clusterrole.rbac.authorization.k8s.io/url-reader created
rbac.sh:61: Successful get clusterrole/url-reader {{range.rules}}{{range.verbs}}{{.}}:{{end}}{{end}}: get:
rbac.sh:62: Successful get clusterrole/url-reader {{range.rules}}{{range.nonResourceURLs}}{{.}}:{{end}}{{end}}: /logs/*:/healthz/*:
clusterrole.rbac.authorization.k8s.io/aggregation-reader created
rbac.sh:64: Successful get clusterrole/aggregation-reader {{.metadata.name}}: aggregation-reader
Successful
message:Error from server (NotFound): clusterrolebindings.rbac.authorization.k8s.io "super-admin" not found
has:clusterrolebindings.rbac.authorization.k8s.io "super-admin" not found
clusterrolebinding.rbac.authorization.k8s.io/super-admin created (dry run)
clusterrolebinding.rbac.authorization.k8s.io/super-admin created (server dry run)
Successful
message:Error from server (NotFound): clusterrolebindings.rbac.authorization.k8s.io "super-admin" not found
has:clusterrolebindings.rbac.authorization.k8s.io "super-admin" not found
clusterrolebinding.rbac.authorization.k8s.io/super-admin created
rbac.sh:77: Successful get clusterrolebinding/super-admin {{range.subjects}}{{.name}}:{{end}}: super-admin:
clusterrolebinding.rbac.authorization.k8s.io/super-admin subjects updated (dry run)
clusterrolebinding.rbac.authorization.k8s.io/super-admin subjects updated (server dry run)
rbac.sh:80: Successful get clusterrolebinding/super-admin {{range.subjects}}{{.name}}:{{end}}: super-admin:
... skipping 62 lines ...
rbac.sh:102: Successful get clusterrolebinding/super-admin {{range.subjects}}{{.name}}:{{end}}: super-admin:foo:test-all-user:
rbac.sh:103: Successful get clusterrolebinding/super-group {{range.subjects}}{{.name}}:{{end}}: the-group:foo:test-all-user:
rbac.sh:104: Successful get clusterrolebinding/super-sa {{range.subjects}}{{.name}}:{{end}}: sa-name:foo:test-all-user:
rolebinding.rbac.authorization.k8s.io/admin created (dry run)
rolebinding.rbac.authorization.k8s.io/admin created (server dry run)
Successful
message:Error from server (NotFound): rolebindings.rbac.authorization.k8s.io "admin" not found
has: not found
rolebinding.rbac.authorization.k8s.io/admin created
rbac.sh:113: Successful get rolebinding/admin {{.roleRef.kind}}: ClusterRole
rbac.sh:114: Successful get rolebinding/admin {{range.subjects}}{{.name}}:{{end}}: default-admin:
rolebinding.rbac.authorization.k8s.io/admin subjects updated
rbac.sh:116: Successful get rolebinding/admin {{range.subjects}}{{.name}}:{{end}}: default-admin:foo:
... skipping 29 lines ...
message:Warning: rbac.authorization.k8s.io/v1beta1 Role is deprecated in v1.17+, unavailable in v1.22+; use rbac.authorization.k8s.io/v1 Role
No resources found in namespace-1610541445-17731 namespace.
has:Role is deprecated
Successful
message:Warning: rbac.authorization.k8s.io/v1beta1 Role is deprecated in v1.17+, unavailable in v1.22+; use rbac.authorization.k8s.io/v1 Role
No resources found in namespace-1610541445-17731 namespace.
Error: 1 warning received
has:Role is deprecated
Successful
message:Warning: rbac.authorization.k8s.io/v1beta1 Role is deprecated in v1.17+, unavailable in v1.22+; use rbac.authorization.k8s.io/v1 Role
No resources found in namespace-1610541445-17731 namespace.
Error: 1 warning received
has:Error: 1 warning received
role.rbac.authorization.k8s.io/pod-admin created (dry run)
role.rbac.authorization.k8s.io/pod-admin created (server dry run)
Successful
message:Error from server (NotFound): roles.rbac.authorization.k8s.io "pod-admin" not found
has: not found
role.rbac.authorization.k8s.io/pod-admin created
rbac.sh:163: Successful get role/pod-admin {{range.rules}}{{range.verbs}}{{.}}:{{end}}{{end}}: *:
rbac.sh:164: Successful get role/pod-admin {{range.rules}}{{range.resources}}{{.}}:{{end}}{{end}}: pods:
rbac.sh:165: Successful get role/pod-admin {{range.rules}}{{range.apiGroups}}{{.}}:{{end}}{{end}}: :
Successful
... skipping 460 lines ...
has:valid-pod
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          0s
has:valid-pod
core.sh:190: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: resource(s) were provided, but no name, label selector, or --all flag specified
core.sh:194: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:198: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: setting 'all' parameter but found a non empty selector.
core.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
core.sh:210: Successful get pods -l'name in (valid-pod)' {{range.items}}{{.metadata.name}}:{{end}}:
core.sh:215: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-kubectl-describe-pod\" }}found{{end}}{{end}}:: :
... skipping 22 lines ...
I0113 12:37:39.897190 54773 client.go:360] parsed scheme: "passthrough"
I0113 12:37:39.897240 54773 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379 <nil> 0 <nil>}] <nil> <nil>}
I0113 12:37:39.897253 54773 clientconn.go:948] ClientConn switching balancer to "pick_first"
core.sh:265: Successful get pdb/test-pdb-3 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 2
poddisruptionbudget.policy/test-pdb-4 created
core.sh:269: Successful get pdb/test-pdb-4 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 50%
error: min-available and max-unavailable cannot be both specified
core.sh:275: Successful get pods --namespace=test-kubectl-describe-pod {{range.items}}{{.metadata.name}}:{{end}}:
pod/env-test-pod created
matched TEST_CMD_1
matched <set to the key 'key-1' in secret 'test-secret'>
matched TEST_CMD_2
matched <set to the key 'key-2' of config map 'test-configmap'>
... skipping 221 lines ...
core.sh:534: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:3.2:
Successful
message:kubectl-create kubectl-patch
has:kubectl-patch
pod/valid-pod patched
core.sh:554: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
+++ [0113 12:37:58] "kubectl patch with resourceVersion 588" returns error as expected: Error from server (Conflict): Operation cannot be fulfilled on pods "valid-pod": the object has been modified; please apply your changes to the latest version and try again
pod "valid-pod" deleted
pod/valid-pod replaced
core.sh:578: Successful get pod valid-pod {{(index .spec.containers 0).name}}: replaced-k8s-serve-hostname
Successful
message:kubectl-create kubectl-patch kubectl-replace
has:kubectl-replace
Successful
message:error: --grace-period must have --force specified
has:\-\-grace-period must have \-\-force specified
Successful
message:error: --timeout must have --force specified
has:\-\-timeout must have \-\-force specified
node/node-v1-test created
W0113 12:37:59.272392 58414 actual_state_of_world.go:534] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="node-v1-test" does not exist
core.sh:606: Successful get node node-v1-test {{range.items}}{{if .metadata.annotations.a}}found{{end}}{{end}}:: :
node/node-v1-test replaced (server dry run)
node/node-v1-test replaced (dry run)
core.sh:631: Successful get node node-v1-test {{range.items}}{{if .metadata.annotations.a}}found{{end}}{{end}}:: :
node/node-v1-test replaced
core.sh:647: Successful get node node-v1-test {{.metadata.annotations.a}}: b
... skipping 29 lines ...
spec:
  containers:
  - image: k8s.gcr.io/pause:2.0
    name: kubernetes-pause
has:localonlyvalue
core.sh:683: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
error: 'name' already has a value (valid-pod), and --overwrite is false
core.sh:687: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
core.sh:691: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
pod/valid-pod labeled
core.sh:695: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod-super-sayan
core.sh:699: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
... skipping 85 lines ...

+++ Running case: test-cmd.run_kubectl_create_error_tests
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_create_error_tests
+++ [0113 12:38:10] Creating namespace namespace-1610541490-18317
namespace/namespace-1610541490-18317 created
Context "test" modified.
+++ [0113 12:38:10] Testing kubectl create with error
Error: must specify one of -f and -k

Create a resource from a file or from stdin.

 JSON and YAML formats are accepted.

Examples:
... skipping 43 lines ...

Usage:
  kubectl create -f FILENAME [options]

Use "kubectl <command> --help" for more information about a given command.
Use "kubectl options" for a list of global command-line options (applies to all commands).
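The resourceVersion-588 case above exercises the API server's optimistic concurrency check: a patch carrying a stale resourceVersion must be rejected with 409 Conflict, and the test treats that rejection as success. A hedged sketch of an equivalent manual check (the pod name matches the test; the image value is illustrative):

    # Patching with a stale resourceVersion should fail with a Conflict.
    kubectl patch pod valid-pod -p \
      '{"metadata":{"resourceVersion":"588"},"spec":{"containers":[{"name":"kubernetes-serve-hostname","image":"nginx"}]}}'
    # Expected: Error from server (Conflict): Operation cannot be fulfilled
    # on pods "valid-pod": the object has been modified; please apply your
    # changes to the latest version and try again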
+++ [0113 12:38:11] "kubectl create with empty string list returns error as expected: error: error validating "hack/testdata/invalid-rc-with-empty-args.yaml": error validating data: ValidationError(ReplicationController.spec.template.spec.containers[0].args): unknown object type "nil" in ReplicationController.spec.template.spec.containers[0].args[0]; if you choose to ignore these errors, turn validation off with --validate=false
+++ exit code: 0
Recording: run_kubectl_apply_tests
Running command: run_kubectl_apply_tests

+++ Running case: test-cmd.run_kubectl_apply_tests
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
... skipping 29 lines ...
I0113 12:38:14.804842 58414 event.go:291] "Event occurred" object="namespace-1610541491-30938/test-deployment-retainkeys-8695b756f8" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: test-deployment-retainkeys-8695b756f8-dcl4r"
deployment.apps "test-deployment-retainkeys" deleted
apply.sh:88: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}:
pod/selector-test-pod created
apply.sh:92: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
Successful
message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
has:pods "selector-test-pod-dont-apply" not found
pod "selector-test-pod" deleted
apply.sh:101: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}:
W0113 12:38:16.013144 66837 helpers.go:567] --dry-run=true is deprecated (boolean value) and can be replaced with --dry-run=client.
pod/test-pod created (dry run)
pod/test-pod created (dry run)
... skipping 37 lines ...
pod "a" deleted
pod "b" deleted
I0113 12:38:22.418303 54773 client.go:360] parsed scheme: "passthrough"
I0113 12:38:22.418349 54773 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379 <nil> 0 <nil>}] <nil> <nil>}
I0113 12:38:22.418358 54773 clientconn.go:948] ClientConn switching balancer to "pick_first"
Successful
message:error: all resources selected for prune without explicitly passing --all. To prune all resources, pass the --all flag. If you did not mean to prune all resources, specify a label selector
has:all resources selected for prune without explicitly passing --all
pod/a created
pod/b created
service/prune-svc created
Warning: extensions/v1beta1 Ingress is deprecated in v1.14+, unavailable in v1.22+; use networking.k8s.io/v1 Ingress
I0113 12:38:25.187109 58414 horizontal.go:359] Horizontal Pod Autoscaler frontend has been deleted in namespace-1610541488-30081
... skipping 41 lines ...
pod/b unchanged
pod/a pruned
Warning: extensions/v1beta1 Ingress is deprecated in v1.14+, unavailable in v1.22+; use networking.k8s.io/v1 Ingress
apply.sh:254: Successful get pods -n nsb {{range.items}}{{.metadata.name}}:{{end}}: b:
namespace "nsb" deleted
Successful
message:error: the namespace from the provided object "nsb" does not match the namespace "foo". You must pass '--namespace=nsb' to perform this operation.
has:the namespace from the provided object "nsb" does not match the namespace "foo".
apply.sh:265: Successful get services {{range.items}}{{.metadata.name}}:{{end}}:
service/a created
apply.sh:269: Successful get services a {{.metadata.name}}: a
Successful
message:The Service "a" is invalid: spec.clusterIPs[0]: Invalid value: []string{"10.0.0.12"}: may not change once set
... skipping 25 lines ...
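The prune guard asserted above exists because `kubectl apply --prune` with no selector would delete every previously-applied object missing from the current manifest set. A sketch of the two accepted forms (directory path and label are illustrative):

    # Either prune everything explicitly...
    kubectl apply --prune --all -f manifests/
    # ...or scope pruning with a label selector.
    kubectl apply --prune -l app=myapp -f manifests/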
apply.sh:291: Successful get deployment test-the-deployment {{.metadata.name}}: test-the-deployment
apply.sh:292: Successful get service test-the-service {{.metadata.name}}: test-the-service
configmap "test-the-map" deleted
service "test-the-service" deleted
deployment.apps "test-the-deployment" deleted
Successful
message:Error from server (NotFound): namespaces "multi-resource-ns" not found
has:namespaces "multi-resource-ns" not found
apply.sh:300: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}:
Successful
message:namespace/multi-resource-ns created
Error from server (NotFound): error when creating "hack/testdata/multi-resource-1.yaml": namespaces "multi-resource-ns" not found
has:namespaces "multi-resource-ns" not found
Successful
message:Error from server (NotFound): pods "test-pod" not found
has:pods "test-pod" not found
pod/test-pod created
namespace/multi-resource-ns unchanged
apply.sh:308: Successful get pods test-pod -n multi-resource-ns {{.metadata.name}}: test-pod
pod "test-pod" deleted
namespace "multi-resource-ns" deleted
I0113 12:38:52.522726 58414 namespace_controller.go:185] Namespace has been deleted nsb
apply.sh:314: Successful get configmaps --field-selector=metadata.name=foo {{range.items}}{{.metadata.name}}:{{end}}:
I0113 12:38:56.450071 54773 client.go:360] parsed scheme: "passthrough"
I0113 12:38:56.450131 54773 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379 <nil> 0 <nil>}] <nil> <nil>}
I0113 12:38:56.450145 54773 clientconn.go:948] ClientConn switching balancer to "pick_first"
Successful
message:configmap/foo created
error: unable to recognize "hack/testdata/multi-resource-2.yaml": no matches for kind "Bogus" in version "example.com/v1"
has:no matches for kind "Bogus" in version "example.com/v1"
apply.sh:320: Successful get configmaps foo {{.metadata.name}}: foo
configmap "foo" deleted
apply.sh:326: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}:
Successful
message:pod/pod-a created
... skipping 6 lines ...
pod "pod-c" deleted
apply.sh:334: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}:
apply.sh:338: Successful get crds {{range.items}}{{.metadata.name}}:{{end}}:
Successful
message:Warning: apiextensions.k8s.io/v1beta1 CustomResourceDefinition is deprecated in v1.16+, unavailable in v1.22+; use apiextensions.k8s.io/v1 CustomResourceDefinition
customresourcedefinition.apiextensions.k8s.io/widgets.example.com created
error: unable to recognize "hack/testdata/multi-resource-4.yaml": no matches for kind "Widget" in version "example.com/v1"
has:no matches for kind "Widget" in version "example.com/v1"
I0113 12:38:58.596503 54773 client.go:360] parsed scheme: "endpoint"
I0113 12:38:58.596546 54773 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 <nil> 0 <nil>}]
Successful
message:Error from server (NotFound): widgets.example.com "foo" not found
has:widgets.example.com "foo" not found
apply.sh:344: Successful get crds widgets.example.com {{.metadata.name}}: widgets.example.com
I0113 12:38:58.927395 54773 controller.go:609] quota admission added evaluator for: widgets.example.com
widget.example.com/foo created
Warning: apiextensions.k8s.io/v1beta1 CustomResourceDefinition is deprecated in v1.16+, unavailable in v1.22+; use apiextensions.k8s.io/v1 CustomResourceDefinition
customresourcedefinition.apiextensions.k8s.io/widgets.example.com unchanged
... skipping 34 lines ...
message:869
has:869
pod "test-pod" deleted
apply.sh:403: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}:
+++ [0113 12:39:02] Testing upgrade kubectl client-side apply to server-side apply
pod/test-pod created
error: Apply failed with 1 conflict: conflict with "kubectl-client-side-apply" using v1: .metadata.labels.name
Please review the fields above--they currently have other managers. Here
are the ways you can resolve this warning:
* If you intend to manage all of these fields, please re-run the apply
  command with the `--force-conflicts` flag.
* If you do not intend to manage all of the fields, please edit your
  manifest to remove references to the fields that should keep their
... skipping 79 lines ...
pod "nginx-extensions" deleted
Successful
message:pod/test1 created
has:pod/test1 created
pod "test1" deleted
Successful
message:error: Invalid image name "InvalidImageName": invalid reference format
has:error: Invalid image name "InvalidImageName": invalid reference format
+++ exit code: 0
Recording: run_kubectl_create_filter_tests
Running command: run_kubectl_create_filter_tests

+++ Running case: test-cmd.run_kubectl_create_filter_tests
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
... skipping 3 lines ...
Context "test" modified.
+++ [0113 12:39:08] Testing kubectl create filter
create.sh:50: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}:
pod/selector-test-pod created
create.sh:54: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
Successful
message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
has:pods "selector-test-pod-dont-apply" not found
pod "selector-test-pod" deleted
+++ exit code: 0
Recording: run_kubectl_apply_deployments_tests
Running command: run_kubectl_apply_deployments_tests
... skipping 29 lines ...
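The conflict printed above is server-side apply's field-ownership check: the `kubectl-client-side-apply` field manager still owns .metadata.labels.name, so a different manager cannot silently take it over. The forced-takeover path the message names looks like this (the filename is illustrative):

    # Take ownership of the conflicting fields from the previous manager.
    kubectl apply --server-side --force-conflicts -f pod.yaml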
I0113 12:39:11.933683 58414 event.go:291] "Event occurred" object="namespace-1610541548-9068/nginx" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-9bb9c4878 to 3"
I0113 12:39:11.938715 58414 event.go:291] "Event occurred" object="namespace-1610541548-9068/nginx-9bb9c4878" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-9bb9c4878-dlztc"
I0113 12:39:11.941157 58414 event.go:291] "Event occurred" object="namespace-1610541548-9068/nginx-9bb9c4878" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-9bb9c4878-grk9f"
I0113 12:39:11.942617 58414 event.go:291] "Event occurred" object="namespace-1610541548-9068/nginx-9bb9c4878" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-9bb9c4878-c8b29"
apps.sh:152: Successful get deployment nginx {{.metadata.name}}: nginx
Successful
message:Error from server (Conflict): error when applying patch:
{"metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1610541548-9068\",\"resourceVersion\":\"99\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx2\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx2\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"},"resourceVersion":"99"},"spec":{"selector":{"matchLabels":{"name":"nginx2"}},"template":{"metadata":{"labels":{"name":"nginx2"}}}}}
to:
Resource: "apps/v1, Resource=deployments", GroupVersionKind: "apps/v1, Kind=Deployment"
Name: "nginx", Namespace: "namespace-1610541548-9068"
for: "hack/testdata/deployment-label-change2.yaml": Operation cannot be fulfilled on deployments.apps "nginx": the object has been modified; please apply your changes to the latest version and try again
has:Error from server (Conflict)
deployment.apps/nginx configured
I0113 12:39:20.583190 58414 event.go:291] "Event occurred" object="namespace-1610541548-9068/nginx" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-6dd6cfdb57 to 3"
I0113 12:39:20.589416 58414 event.go:291] "Event occurred" object="namespace-1610541548-9068/nginx-6dd6cfdb57" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-6dd6cfdb57-zk8r4"
I0113 12:39:20.594329 58414 event.go:291] "Event occurred" object="namespace-1610541548-9068/nginx-6dd6cfdb57" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-6dd6cfdb57-nx2wg"
I0113 12:39:20.595174 58414 event.go:291] "Event occurred" object="namespace-1610541548-9068/nginx-6dd6cfdb57" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-6dd6cfdb57-wbsnd"
Successful
... skipping 303 lines ...
+++ [0113 12:39:29] Creating namespace namespace-1610541569-32708
namespace/namespace-1610541569-32708 created
Context "test" modified.
+++ [0113 12:39:29] Testing kubectl get
get.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}:
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:37: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}:
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:45: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}:
Successful
message:{
    "apiVersion": "v1",
    "items": [],
... skipping 23 lines ...
has not:No resources found
Successful
message:NAME
has not:No resources found
get.sh:73: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}:
Successful
message:error: the server doesn't have a resource type "foobar"
has not:No resources found
Successful
message:No resources found in namespace-1610541569-32708 namespace.
has:No resources found
Successful
message:
has not:No resources found
Successful
message:No resources found in namespace-1610541569-32708 namespace.
has:No resources found
get.sh:93: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}:
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
Successful
message:Error from server (NotFound): pods "abc" not found
has not:List
Successful
message:I0113 12:39:31.711772 70337 loader.go:372] Config loaded from file: /tmp/tmp.oYHBCudpFw/.kube/config
I0113 12:39:31.717546 70337 round_trippers.go:445] GET https://127.0.0.1:6443/version?timeout=32s 200 OK in 5 milliseconds
I0113 12:39:31.741116 70337 round_trippers.go:445] GET https://127.0.0.1:6443/api/v1/namespaces/default/pods 200 OK in 1 milliseconds
I0113 12:39:31.742764 70337 round_trippers.go:445] GET https://127.0.0.1:6443/api/v1/namespaces/default/replicationcontrollers 200 OK in 1 milliseconds
... skipping 636 lines ...
}
get.sh:158: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
<no value>Successful
message:valid-pod:
has:valid-pod:
Successful
message:error: error executing jsonpath "{.missing}": Error executing template: missing is not found.
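The round_trippers.go lines captured above are kubectl's own request log, which the test obtains by raising client verbosity; at -v=6 and above every HTTP request, status code, and latency is printed to stderr. A minimal reproduction sketch:

    # -v=6 logs each API request line (method, URL, status, latency),
    # matching the round_trippers.go output in the captured message.
    kubectl get pods -v=6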
Printing more information for debugging the template:
	template was:
		{.missing}
	object given to jsonpath engine was:
		map[string]interface {}{"apiVersion":"v1", "kind":"Pod", "metadata":map[string]interface {}{"creationTimestamp":"2021-01-13T12:39:39Z", "labels":map[string]interface {}{"name":"valid-pod"}, "managedFields":[]interface {}{map[string]interface {}{"apiVersion":"v1", "fieldsType":"FieldsV1", "fieldsV1":map[string]interface {}{"f:metadata":map[string]interface {}{"f:labels":map[string]interface {}{".":map[string]interface {}{}, "f:name":map[string]interface {}{}}}, "f:spec":map[string]interface {}{"f:containers":map[string]interface {}{"k:{\"name\":\"kubernetes-serve-hostname\"}":map[string]interface {}{".":map[string]interface {}{}, "f:image":map[string]interface {}{}, "f:imagePullPolicy":map[string]interface {}{}, "f:name":map[string]interface {}{}, "f:resources":map[string]interface {}{".":map[string]interface {}{}, "f:limits":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}, "f:requests":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}}, "f:terminationMessagePath":map[string]interface {}{}, "f:terminationMessagePolicy":map[string]interface {}{}}}, "f:dnsPolicy":map[string]interface {}{}, "f:enableServiceLinks":map[string]interface {}{}, "f:restartPolicy":map[string]interface {}{}, "f:schedulerName":map[string]interface {}{}, "f:securityContext":map[string]interface {}{}, "f:terminationGracePeriodSeconds":map[string]interface {}{}}}, "manager":"kubectl-create", "operation":"Update", "time":"2021-01-13T12:39:39Z"}}, "name":"valid-pod", "namespace":"namespace-1610541579-12645", "resourceVersion":"1040", "uid":"0ea950ce-f325-4845-ac52-717610edc27d"}, "spec":map[string]interface {}{"containers":[]interface {}{map[string]interface {}{"image":"k8s.gcr.io/serve_hostname", "imagePullPolicy":"Always", "name":"kubernetes-serve-hostname", "resources":map[string]interface {}{"limits":map[string]interface {}{"cpu":"1", "memory":"512Mi"}, "requests":map[string]interface {}{"cpu":"1", "memory":"512Mi"}}, "terminationMessagePath":"/dev/termination-log", "terminationMessagePolicy":"File"}}, "dnsPolicy":"ClusterFirst", "enableServiceLinks":true, "preemptionPolicy":"PreemptLowerPriority", "priority":0, "restartPolicy":"Always", "schedulerName":"default-scheduler", "securityContext":map[string]interface {}{}, "terminationGracePeriodSeconds":30}, "status":map[string]interface {}{"phase":"Pending", "qosClass":"Guaranteed"}}
has:missing is not found
error: error executing template "{{.missing}}": template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing"
Successful
message:Error executing template: template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing".
Printing more information for debugging the template:
	template was:
		{{.missing}}
	raw data was:
		{"apiVersion":"v1","kind":"Pod","metadata":{"creationTimestamp":"2021-01-13T12:39:39Z","labels":{"name":"valid-pod"},"managedFields":[{"apiVersion":"v1","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:labels":{".":{},"f:name":{}}},"f:spec":{"f:containers":{"k:{\"name\":\"kubernetes-serve-hostname\"}":{".":{},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:resources":{".":{},"f:limits":{".":{},"f:cpu":{},"f:memory":{}},"f:requests":{".":{},"f:cpu":{},"f:memory":{}}},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:enableServiceLinks":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}},"manager":"kubectl-create","operation":"Update","time":"2021-01-13T12:39:39Z"}],"name":"valid-pod","namespace":"namespace-1610541579-12645","resourceVersion":"1040","uid":"0ea950ce-f325-4845-ac52-717610edc27d"},"spec":{"containers":[{"image":"k8s.gcr.io/serve_hostname","imagePullPolicy":"Always","name":"kubernetes-serve-hostname","resources":{"limits":{"cpu":"1","memory":"512Mi"},"requests":{"cpu":"1","memory":"512Mi"}},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File"}],"dnsPolicy":"ClusterFirst","enableServiceLinks":true,"preemptionPolicy":"PreemptLowerPriority","priority":0,"restartPolicy":"Always","schedulerName":"default-scheduler","securityContext":{},"terminationGracePeriodSeconds":30},"status":{"phase":"Pending","qosClass":"Guaranteed"}}
	object given to template engine was:
		map[apiVersion:v1 kind:Pod metadata:map[creationTimestamp:2021-01-13T12:39:39Z labels:map[name:valid-pod] managedFields:[map[apiVersion:v1 fieldsType:FieldsV1 fieldsV1:map[f:metadata:map[f:labels:map[.:map[] f:name:map[]]] f:spec:map[f:containers:map[k:{"name":"kubernetes-serve-hostname"}:map[.:map[] f:image:map[] f:imagePullPolicy:map[] f:name:map[] f:resources:map[.:map[] f:limits:map[.:map[] f:cpu:map[] f:memory:map[]] f:requests:map[.:map[] f:cpu:map[] f:memory:map[]]] f:terminationMessagePath:map[] f:terminationMessagePolicy:map[]]] f:dnsPolicy:map[] f:enableServiceLinks:map[] f:restartPolicy:map[] f:schedulerName:map[] f:securityContext:map[] f:terminationGracePeriodSeconds:map[]]] manager:kubectl-create operation:Update time:2021-01-13T12:39:39Z]] name:valid-pod namespace:namespace-1610541579-12645 resourceVersion:1040 uid:0ea950ce-f325-4845-ac52-717610edc27d] spec:map[containers:[map[image:k8s.gcr.io/serve_hostname imagePullPolicy:Always name:kubernetes-serve-hostname resources:map[limits:map[cpu:1 memory:512Mi] requests:map[cpu:1 memory:512Mi]] terminationMessagePath:/dev/termination-log terminationMessagePolicy:File]] dnsPolicy:ClusterFirst enableServiceLinks:true preemptionPolicy:PreemptLowerPriority priority:0 restartPolicy:Always schedulerName:default-scheduler securityContext:map[] terminationGracePeriodSeconds:30] status:map[phase:Pending qosClass:Guaranteed]]
... skipping 156 lines ...
  terminationGracePeriodSeconds: 30
status:
  phase: Pending
  qosClass: Guaranteed
has:name: valid-pod
Successful
message:Error from server (NotFound): pods "invalid-pod" not found
has:"invalid-pod" not found
pod "valid-pod" deleted
get.sh:196: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}:
pod/redis-master created
pod/valid-pod created
Successful
... skipping 36 lines ...
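The two failures above contrast kubectl's output engines on a missing key: jsonpath reports "missing is not found", while the go-template engine reports "map has no entry for key". Equivalent invocations, using the pod name from the test:

    kubectl get pod valid-pod -o jsonpath='{.missing}'       # jsonpath engine
    kubectl get pod valid-pod -o go-template='{{.missing}}'  # go-template engine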
+++ [0113 12:39:45] Creating namespace namespace-1610541585-21390
namespace/namespace-1610541585-21390 created
Context "test" modified.
+++ [0113 12:39:45] Testing kubectl exec POD COMMAND
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
pod/test-pod created
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pods "test-pod" not found
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod or type/name must be specified
pod "test-pod" deleted
+++ exit code: 0
Recording: run_kubectl_exec_resource_name_tests
Running command: run_kubectl_exec_resource_name_tests
... skipping 3 lines ...
+++ [0113 12:39:46] Creating namespace namespace-1610541586-31708
namespace/namespace-1610541586-31708 created
Context "test" modified.
+++ [0113 12:39:46] Testing kubectl exec TYPE/NAME COMMAND
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
error: the server doesn't have a resource type "foo"
has:error:
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (NotFound): deployments.apps "bar" not found
has:"bar" not found
pod/test-pod created
replicaset.apps/frontend created
I0113 12:39:46.939294 58414 event.go:291] "Event occurred" object="namespace-1610541586-31708/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-bphg2"
I0113 12:39:46.946837 58414 event.go:291] "Event occurred" object="namespace-1610541586-31708/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-jfs4j"
I0113 12:39:46.947395 58414 event.go:291] "Event occurred" object="namespace-1610541586-31708/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-ngn24"
configmap/test-set-env-config created
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
error: cannot attach to *v1.ConfigMap: selector for *v1.ConfigMap not implemented
has:not implemented
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod test-pod does not have a host assigned
has not:not found
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod, type/name or --filename must be specified
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod frontend-bphg2 does not have a host assigned
has not:not found
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod frontend-bphg2 does not have a host assigned
has not:pod, type/name or --filename must be specified
pod "test-pod" deleted
replicaset.apps "frontend" deleted
configmap "test-set-env-config" deleted
+++ exit code: 0
Recording: run_create_secret_tests
Running command: run_create_secret_tests

+++ Running case: test-cmd.run_create_secret_tests
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_create_secret_tests
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
message:user-specified
has:user-specified
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","uid":"d0c4843a-6d35-40c8-a343-029ec871dc12","resourceVersion":"1120","creationTimestamp":"2021-01-13T12:39:48Z"}}
Successful
message:{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","uid":"d0c4843a-6d35-40c8-a343-029ec871dc12","resourceVersion":"1121","creationTimestamp":"2021-01-13T12:39:48Z"},"data":{"key1":"config1"}}
has:uid
Successful
message:{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","uid":"d0c4843a-6d35-40c8-a343-029ec871dc12","resourceVersion":"1121","creationTimestamp":"2021-01-13T12:39:48Z"},"data":{"key1":"config1"}}
has:config1
{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Success","details":{"name":"tester-update-cm","kind":"configmaps","uid":"d0c4843a-6d35-40c8-a343-029ec871dc12"}}
Successful
message:Error from server (NotFound): configmaps "tester-update-cm" not found
has:configmaps "tester-update-cm" not found
+++ exit code: 0
Recording: run_kubectl_create_kustomization_directory_tests
Running command: run_kubectl_create_kustomization_directory_tests

+++ Running case: test-cmd.run_kubectl_create_kustomization_directory_tests
... skipping 162 lines ...
}
request-timeout.sh:34: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
has:valid-pod
FAIL!
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
I0113 12:39:53.301781 72023 streamwatcher.go:117] Unable to decode an event from the watch stream: context deadline exceeded
has not:Timeout
42 /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../test/cmd/request-timeout.sh
!!! [0113 12:39:53] Call tree:
!!! [0113 12:39:53]  1: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:47 run_kubectl_request_timeout_tests(...)
!!! [0113 12:39:53]  2: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:112 eVal(...)
!!! [0113 12:39:53]  3: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:131 juLog(...)
!!! [0113 12:39:53]  4: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:576 record_command(...)
!!! [0113 12:39:53]  5: hack/make-rules/test-cmd.sh:184 runTests(...)
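This is the suite's one real failure: request-timeout.sh line 42 expected the word "Timeout" in kubectl's output, but the watch ended with the client-side "context deadline exceeded" message instead. A sketch of the failing assertion, reconstructed from the log rather than quoted from the script (kube::test::if_has_string is the suite's assertion helper; the exact flags on the kubectl invocation are an assumption):

    # Watch with a short client-side timeout and assert on the output.
    output_message=$(kubectl get pods --request-timeout=1 --watch 2>&1)
    kube::test::if_has_string "${output_message}" 'Timeout'
    # Observed instead: "Unable to decode an event from the watch stream:
    # context deadline exceeded", so the check at line 42 failed.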
+++ exit code: 1
+++ error: 1
Error when running run_kubectl_request_timeout_tests
Recording: run_crd_tests
Running command: run_crd_tests
+++ Running case: test-cmd.run_crd_tests
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_crd_tests
... skipping 237 lines ...
foo.company.com/test patched
crd.sh:236: Successful get foos/test {{.patched}}: value1
foo.company.com/test patched
crd.sh:238: Successful get foos/test {{.patched}}: value2
foo.company.com/test patched
crd.sh:240: Successful get foos/test {{.patched}}: <no value>
+++ [0113 12:39:59] "kubectl patch --local" returns error as expected for CustomResource: error: cannot apply strategic merge patch for company.com/v1, Kind=Foo locally, try --type merge
{
    "apiVersion": "company.com/v1",
    "kind": "Foo",
    "metadata": {
        "annotations": {
            "kubernetes.io/change-cause": "kubectl patch foos/test --server=https://127.0.0.1:6443 --insecure-skip-tls-verify=true --match-server-version=true --patch={\"patched\":null} --type=merge --record=true"
... skipping 369 lines ...
crd.sh:450: Successful get bars {{range.items}}{{.metadata.name}}:{{end}}:
namespace/non-native-resources created
bar.company.com/test created
crd.sh:455: Successful get bars {{len .items}}: 1
namespace "non-native-resources" deleted
crd.sh:458: Successful get bars {{len .items}}: 0
Error from server (NotFound): namespaces "non-native-resources" not found
customresourcedefinition.apiextensions.k8s.io "foos.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "bars.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
customresourcedefinition.apiextensions.k8s.io "validfoos.company.com" deleted
+++ exit code: 0
+++ [0113 12:40:25] Testing recursive resources
+++ [0113 12:40:25] Creating namespace namespace-1610541625-8318
namespace/namespace-1610541625-8318 created
Context "test" modified.
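Every recursive-resource case below feeds kubectl a directory tree in which one manifest is deliberately broken: the decode errors that follow show the fixture carries "ind" where "kind" belongs, so the object's Kind is never set. A reconstruction of such a fixture (the real file is hack/testdata/recursive/pod/pod/busybox-broken.yaml; fields are taken from the decode errors below, the YAML layout is assumed):

  # busybox-broken.yaml (reconstructed sketch): 'kind' is misspelled as 'ind',
  # so kubectl reports "Object 'Kind' is missing" / "kind not set".
  apiVersion: v1
  ind: Pod
  metadata:
    name: busybox2
    labels:
      app: busybox2
  spec:
    restartPolicy: Always
    containers:
    - name: busybox
      image: busybox
      imagePullPolicy: IfNotPresent
      command: ["sleep", "3600"]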
generic-resources.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}:
W0113 12:40:26.107484 54773 cacher.go:148] Terminating all watchers from cacher *unstructured.Unstructured
generic-resources.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0113 12:40:26.109036 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:pod/busybox0 created
pod/busybox1 created
error: error validating "hack/testdata/recursive/pod/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
W0113 12:40:26.202708 54773 cacher.go:148] Terminating all watchers from cacher *unstructured.Unstructured
E0113 12:40:26.204243 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:211: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
W0113 12:40:26.299585 54773 cacher.go:148] Terminating all watchers from cacher *unstructured.Unstructured
E0113 12:40:26.301141 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
W0113 12:40:26.405906 54773 cacher.go:148] Terminating all watchers from cacher *unstructured.Unstructured
E0113 12:40:26.407344 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:220: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: busybox:busybox:
Successful
message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:227: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:231: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
Successful
message:pod/busybox0 replaced
pod/busybox1 replaced
error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
generic-resources.sh:236: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0113 12:40:27.035785 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Name:         busybox0
Namespace:    namespace-1610541625-8318
Priority:     0
Node:         <none>
Labels:       app=busybox0
... skipping 154 lines ...
Node-Selectors:   <none>
Tolerations:      <none>
Events:           <none>
unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:246: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0113 12:40:27.259202 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:250: Successful get pods {{range.items}}{{.metadata.annotations.annotatekey}}:{{end}}: annotatevalue:annotatevalue:
Successful
message:pod/busybox0 annotated
pod/busybox1 annotated
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:255: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0113 12:40:27.633789 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 12:40:27.754327 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:259: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
Successful
message:Warning: resource pods/busybox0 is missing the kubectl.kubernetes.io/last-applied-configuration annotation which is required by kubectl apply. kubectl apply should only be used on resources created declaratively by either kubectl create --save-config or kubectl apply. The missing annotation will be patched automatically.
pod/busybox0 configured
Warning: resource pods/busybox1 is missing the kubectl.kubernetes.io/last-applied-configuration annotation which is required by kubectl apply. kubectl apply should only be used on resources created declaratively by either kubectl create --save-config or kubectl apply. The missing annotation will be patched automatically.
pod/busybox1 configured
error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
generic-resources.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}:
deployment.apps/nginx created
I0113 12:40:28.184054 58414 event.go:291] "Event occurred" object="namespace-1610541625-8318/nginx" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-54785cbcb8 to 3"
I0113 12:40:28.187957 58414 event.go:291] "Event occurred" object="namespace-1610541625-8318/nginx-54785cbcb8" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-54785cbcb8-tnjg9"
I0113 12:40:28.193108 58414 event.go:291] "Event occurred" object="namespace-1610541625-8318/nginx-54785cbcb8" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-54785cbcb8-jw44c"
I0113 12:40:28.195040 58414 event.go:291] "Event occurred" object="namespace-1610541625-8318/nginx-54785cbcb8" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-54785cbcb8-6qrhv"
... skipping 40 lines ...
      restartPolicy: Always
      schedulerName: default-scheduler
      securityContext: {}
      terminationGracePeriodSeconds: 30
status: {}
has:extensions/v1beta1
E0113 12:40:28.661728 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx" deleted
generic-resources.sh:281: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:285: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:290: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:busybox0:busybox1:
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E0113 12:40:29.394851 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:299: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
generic-resources.sh:304: Successful get pods {{range.items}}{{.metadata.labels.mylabel}}:{{end}}: myvalue:myvalue:
Successful
message:pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E0113 12:40:29.633304 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0113 12:40:29.660820 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:309: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0113 12:40:29.832572 58414 namespace_controller.go:185] Namespace has been deleted non-native-resources
generic-resources.sh:314: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: prom/busybox:prom/busybox:
Successful
message:pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:319: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:323: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}:
Successful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "busybox0" force deleted pod "busybox1" force deleted error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}' has:Object 'Kind' is missing [32mgeneric-resources.sh:328: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: (B[mreplicationcontroller/busybox0 created I0113 12:40:30.554256 58414 event.go:291] "Event occurred" object="namespace-1610541625-8318/busybox0" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox0-p5d7r" replicationcontroller/busybox1 created error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false I0113 12:40:30.559371 58414 event.go:291] "Event occurred" object="namespace-1610541625-8318/busybox1" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox1-t8pk8" [32mgeneric-resources.sh:332: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1: (B[m[32mgeneric-resources.sh:337: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1: (B[m[32mgeneric-resources.sh:338: Successful get rc busybox0 {{.spec.replicas}}: 1 (B[m[32mgeneric-resources.sh:339: Successful get rc busybox1 {{.spec.replicas}}: 1 (B[m[32mgeneric-resources.sh:344: Successful get hpa busybox0 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80 (B[m[32mgeneric-resources.sh:345: Successful get hpa busybox1 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80 (B[mSuccessful message:horizontalpodautoscaler.autoscaling/busybox0 autoscaled horizontalpodautoscaler.autoscaling/busybox1 autoscaled error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}' has:Object 'Kind' is missing horizontalpodautoscaler.autoscaling "busybox0" deleted horizontalpodautoscaler.autoscaling "busybox1" deleted [32mgeneric-resources.sh:353: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1: (B[m[32mgeneric-resources.sh:354: Successful get rc busybox0 {{.spec.replicas}}: 1 (B[m[32mgeneric-resources.sh:355: Successful get rc busybox1 {{.spec.replicas}}: 1 (B[m[32mgeneric-resources.sh:359: Successful get service busybox0 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80 (B[m[32mgeneric-resources.sh:360: Successful get service busybox1 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80 (B[mSuccessful message:service/busybox0 exposed service/busybox1 exposed error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in 
'{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}' has:Object 'Kind' is missing [32mgeneric-resources.sh:366: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1: (B[m[32mgeneric-resources.sh:367: Successful get rc busybox0 {{.spec.replicas}}: 1 (B[m[32mgeneric-resources.sh:368: Successful get rc busybox1 {{.spec.replicas}}: 1 (B[mI0113 12:40:32.402616 58414 event.go:291] "Event occurred" object="namespace-1610541625-8318/busybox0" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox0-fwp9b" I0113 12:40:32.412460 58414 event.go:291] "Event occurred" object="namespace-1610541625-8318/busybox1" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox1-bjgl8" [32mgeneric-resources.sh:372: Successful get rc busybox0 {{.spec.replicas}}: 2 (B[m[32mgeneric-resources.sh:373: Successful get rc busybox1 {{.spec.replicas}}: 2 (B[mSuccessful message:replicationcontroller/busybox0 scaled replicationcontroller/busybox1 scaled error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}' has:Object 'Kind' is missing [32mgeneric-resources.sh:378: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1: (B[m[32mgeneric-resources.sh:382: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: (B[mSuccessful message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely. 
replicationcontroller "busybox0" force deleted replicationcontroller "busybox1" force deleted error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}' has:Object 'Kind' is missing [32mgeneric-resources.sh:387: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: (B[mE0113 12:40:33.075576 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource deployment.apps/nginx1-deployment created I0113 12:40:33.182045 58414 event.go:291] "Event occurred" object="namespace-1610541625-8318/nginx1-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx1-deployment-758b5949b6 to 2" deployment.apps/nginx0-deployment created error: error validating "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false I0113 12:40:33.186120 58414 event.go:291] "Event occurred" object="namespace-1610541625-8318/nginx1-deployment-758b5949b6" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx1-deployment-758b5949b6-8x7r4" I0113 12:40:33.188144 58414 event.go:291] "Event occurred" object="namespace-1610541625-8318/nginx0-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx0-deployment-75db9cdfd9 to 2" I0113 12:40:33.190980 58414 event.go:291] "Event occurred" object="namespace-1610541625-8318/nginx1-deployment-758b5949b6" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx1-deployment-758b5949b6-bq882" I0113 12:40:33.193505 58414 event.go:291] "Event occurred" object="namespace-1610541625-8318/nginx0-deployment-75db9cdfd9" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx0-deployment-75db9cdfd9-4mbb8" I0113 12:40:33.198562 58414 event.go:291] "Event occurred" object="namespace-1610541625-8318/nginx0-deployment-75db9cdfd9" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx0-deployment-75db9cdfd9-gb87l" [32mgeneric-resources.sh:391: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx0-deployment:nginx1-deployment: (B[mE0113 12:40:33.361434 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource [32mgeneric-resources.sh:392: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9: (B[m[32mgeneric-resources.sh:396: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9: (B[mSuccessful 
message:deployment.apps/nginx1-deployment skipped rollback (current template already matches revision 1)
deployment.apps/nginx0-deployment skipped rollback (current template already matches revision 1)
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
deployment.apps/nginx1-deployment paused
deployment.apps/nginx0-deployment paused
generic-resources.sh:404: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: true:true:
Successful
message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
... skipping 10 lines ...
1         <none>
deployment.apps/nginx0-deployment
REVISION  CHANGE-CAUSE
1         <none>
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx0-deployment
Successful
message:deployment.apps/nginx1-deployment
REVISION  CHANGE-CAUSE
1         <none>
deployment.apps/nginx0-deployment
REVISION  CHANGE-CAUSE
1         <none>
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx1-deployment
Successful
message:deployment.apps/nginx1-deployment
REVISION  CHANGE-CAUSE
1         <none>
deployment.apps/nginx0-deployment
REVISION  CHANGE-CAUSE
1         <none>
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
deployment.apps "nginx1-deployment" force deleted deployment.apps "nginx0-deployment" force deleted error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}' E0113 12:40:34.394310 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource [32mgeneric-resources.sh:426: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: (B[mE0113 12:40:35.371194 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource replicationcontroller/busybox0 created I0113 12:40:35.503324 58414 event.go:291] "Event occurred" object="namespace-1610541625-8318/busybox0" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox0-qfrdj" replicationcontroller/busybox1 created error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false I0113 12:40:35.509239 58414 event.go:291] "Event occurred" object="namespace-1610541625-8318/busybox1" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox1-zbrj6" [32mgeneric-resources.sh:430: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1: (B[mSuccessful message:no rollbacker has been implemented for "ReplicationController" no rollbacker has been implemented for "ReplicationController" unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}' ... skipping 2 lines ... 
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox0" pausing is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox1" pausing is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
'{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}' error: replicationcontrollers "busybox0" resuming is not supported error: replicationcontrollers "busybox1" resuming is not supported has:replicationcontrollers "busybox0" resuming is not supported Successful message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}' error: replicationcontrollers "busybox0" resuming is not supported error: replicationcontrollers "busybox1" resuming is not supported has:replicationcontrollers "busybox1" resuming is not supported warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely. replicationcontroller "busybox0" force deleted replicationcontroller "busybox1" force deleted error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}' Recording: run_namespace_tests Running command: run_namespace_tests +++ Running case: test-cmd.run_namespace_tests +++ working dir: /home/prow/go/src/k8s.io/kubernetes +++ command: run_namespace_tests +++ [0113 12:40:37] Testing kubectl(v1:namespaces) Successful message:Error from server (NotFound): namespaces "my-namespace" not found has: not found namespace/my-namespace created (dry run) namespace/my-namespace created (server dry run) Successful message:Error from server (NotFound): namespaces "my-namespace" not found has: not found namespace/my-namespace created [32mcore.sh:1459: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace (B[mnamespace "my-namespace" deleted E0113 12:40:41.600259 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource E0113 12:40:42.142242 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource namespace/my-namespace condition met I0113 12:40:42.899377 58414 shared_informer.go:240] Waiting for caches to sync for garbage collector I0113 12:40:42.899439 58414 shared_informer.go:247] Caches are synced for garbage collector Successful message:Error from server (NotFound): namespaces "my-namespace" not found has: not found namespace/my-namespace 
core.sh:1468: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
I0113 12:40:43.170747 58414 shared_informer.go:240] Waiting for caches to sync for resource quota
I0113 12:40:43.170801 58414 shared_informer.go:247] Caches are synced for resource quota
Successful
... skipping 33 lines ...
namespace "namespace-1610541589-2833" deleted
namespace "namespace-1610541589-9569" deleted
namespace "namespace-1610541591-26747" deleted
namespace "namespace-1610541593-26415" deleted
namespace "namespace-1610541595-31340" deleted
namespace "namespace-1610541625-8318" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:warning: deleting cluster-scoped resources
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
namespace "kube-node-lease" deleted
namespace "my-namespace" deleted
namespace "namespace-1610541429-10246" deleted
... skipping 29 lines ...
namespace "namespace-1610541589-2833" deleted
namespace "namespace-1610541589-9569" deleted
namespace "namespace-1610541591-26747" deleted
namespace "namespace-1610541593-26415" deleted
namespace "namespace-1610541595-31340" deleted
namespace "namespace-1610541625-8318" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:namespace "my-namespace" deleted
namespace/quotas created
E0113 12:40:43.451508 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1475: Successful get namespaces/quotas {{.metadata.name}}: quotas
core.sh:1476: Successful get quota --namespace=quotas {{range.items}}{{ if eq .metadata.name \"test-quota\" }}found{{end}}{{end}}:: :
resourcequota/test-quota created (dry run)
resourcequota/test-quota created (server dry run)
core.sh:1480: Successful get quota --namespace=quotas {{range.items}}{{ if eq .metadata.name \"test-quota\" }}found{{end}}{{end}}:: :
resourcequota/test-quota created
core.sh:1483: Successful get quota --namespace=quotas {{range.items}}{{ if eq .metadata.name \"test-quota\" }}found{{end}}{{end}}:: found:
I0113 12:40:44.230589 58414 resource_quota_controller.go:307] Resource quota has been deleted quotas/test-quota
resourcequota "test-quota" deleted
namespace "quotas" deleted
E0113 12:40:44.634548 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 12:40:46.051410 58414 horizontal.go:359] Horizontal Pod Autoscaler busybox0 has been deleted in namespace-1610541625-8318
I0113 12:40:46.055044 58414 horizontal.go:359] Horizontal Pod Autoscaler busybox1 has been deleted in namespace-1610541625-8318
core.sh:1495: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"other\" }}found{{end}}{{end}}:: :
namespace/other created
core.sh:1499: Successful get namespaces/other {{.metadata.name}}: other
core.sh:1503: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}:
pod/valid-pod created
core.sh:1507: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:1509: Successful get pods -n other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
Successful
message:error: a resource cannot be retrieved by name across all namespaces
has:a resource cannot be retrieved by name across all namespaces
core.sh:1516: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
core.sh:1520: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}:
namespace "other" deleted
... skipping 36 lines ...
I0113 12:40:54.231068 58414 namespace_controller.go:185] Namespace has been deleted namespace-1610541586-31708
I0113 12:40:54.238783 58414 namespace_controller.go:185] Namespace has been deleted namespace-1610541593-26415
I0113 12:40:54.241116 58414 namespace_controller.go:185] Namespace has been deleted namespace-1610541591-26747
I0113 12:40:54.249394 58414 namespace_controller.go:185] Namespace has been deleted namespace-1610541595-31340
I0113 12:40:54.292047 58414 namespace_controller.go:185] Namespace has been deleted namespace-1610541625-8318
I0113 12:40:54.415989 58414 namespace_controller.go:185] Namespace has been deleted quotas
E0113 12:40:54.915011 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
E0113 12:40:55.829553 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_secrets_test
Running command: run_secrets_test
+++ Running case: test-cmd.run_secrets_test
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_secrets_test
... skipping 71 lines ...
core.sh:911: Successful get secret/secret-string-data --namespace=test-secrets {{.stringData}}: <no value>
secret "secret-string-data" deleted
core.sh:920: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}:
secret "test-secret" deleted
namespace "test-secrets" deleted
I0113 12:41:00.775297 58414 namespace_controller.go:185] Namespace has been deleted other
E0113 12:41:02.907350 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_configmap_tests
Running command: run_configmap_tests
+++ Running case: test-cmd.run_configmap_tests
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
... skipping 17 lines ...
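The configmap cases resuming below create one plain and one binary configmap inside a dedicated test-configmaps namespace and then tear everything down. A sketch of the likely shape of those steps (names taken from the log; the --from-literal/--from-file data is assumed):

  kubectl create namespace test-configmaps
  kubectl create configmap test-configmap --namespace=test-configmaps --from-literal=key1=value1
  kubectl create configmap test-binary-configmap --namespace=test-configmaps --from-file=key2=/tmp/data.bin
  kubectl delete namespace test-configmaps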
configmap/test-binary-configmap created
core.sh:51: Successful get configmap/test-configmap --namespace=test-configmaps {{.metadata.name}}: test-configmap
core.sh:52: Successful get configmap/test-binary-configmap --namespace=test-configmaps {{.metadata.name}}: test-binary-configmap
configmap "test-configmap" deleted
configmap "test-binary-configmap" deleted
namespace "test-configmaps" deleted
E0113 12:41:07.614103 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0113 12:41:09.915767 58414 namespace_controller.go:185] Namespace has been deleted test-secrets
+++ exit code: 0
Recording: run_client_config_tests
Running command: run_client_config_tests
+++ Running case: test-cmd.run_client_config_tests
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_client_config_tests
+++ [0113 12:41:12] Creating namespace namespace-1610541672-27495
namespace/namespace-1610541672-27495 created
Context "test" modified.
+++ [0113 12:41:12] Testing client config
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:Error in configuration: context was not found for specified context: missing-context
has:context was not found for specified context: missing-context
Successful
message:error: no server found for cluster "missing-cluster"
has:no server found for cluster "missing-cluster"
Successful
message:error: auth info "missing-user" does not exist
has:auth info "missing-user" does not exist
Successful
message:error: error loading config file "/tmp/newconfig.yaml": no kind "Config" is registered for version "v-1" in scheme "k8s.io/client-go/tools/clientcmd/api/latest/latest.go:50"
has:error loading config file
Successful
message:error: stat missing-config: no such file or directory
has:no such file or directory
+++ exit code: 0
Recording: run_service_accounts_tests
Running command: run_service_accounts_tests
+++ Running case: test-cmd.run_service_accounts_tests
... skipping 46 lines ...
Labels:                        <none>
Annotations:                   <none>
Schedule:                      59 23 31 2 *
Concurrency Policy:            Allow
Suspend:                       False
Successful Job History Limit:  3
Failed Job History Limit:      1
Starting Deadline Seconds:     <unset>
Selector:                      <unset>
Parallelism:                   <unset>
Completions:                   <unset>
Pod Template:
  Labels:  <none>
... skipping 38 lines ...
Labels:         controller-uid=2f3159a8-3a5d-432a-932c-ec4efaa76ef7
                job-name=test-job
Annotations:    cronjob.kubernetes.io/instantiate: manual
Parallelism:    1
Completions:    1
Start Time:     Wed, 13 Jan 2021 12:41:21 +0000
Pods Statuses:  1 Running / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  controller-uid=2f3159a8-3a5d-432a-932c-ec4efaa76ef7
           job-name=test-job
  Containers:
   pi:
    Image:      k8s.gcr.io/perl
... skipping 494 lines ...
  type: ClusterIP
status:
  loadBalancer: {}
Successful
message:kubectl-create kubectl-set
has:kubectl-set
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
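The error block above is kubectl refusing a client-side-only edit without an input object: --local never contacts the server, so the resource has to come from a file. A hedged sketch of the pattern the service tests exercise here (the file name is illustrative):

  # Fails with the error above: --local given a live type/name instead of a file.
  kubectl set selector service redis-master role=padawan --local -o yaml
  # Works: the object is read locally, edited, and printed without being sent.
  kubectl set selector -f redis-master-service.yaml role=padawan --local -o yaml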
core.sh:1020: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
I0113 12:41:32.265428 58414 namespace_controller.go:185] Namespace has been deleted test-jobs
service/redis-master selector updated
Successful
message:Error from server (Conflict): Operation cannot be fulfilled on services "redis-master": the object has been modified; please apply your changes to the latest version and try again
has:Conflict
core.sh:1033: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
service "redis-master" deleted
core.sh:1040: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
core.sh:1044: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
service/redis-master created
E0113 12:41:33.218617 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1048: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
core.sh:1052: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
service/service-v1-test created
core.sh:1073: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:service-v1-test:
service/service-v1-test replaced
core.sh:1080: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:service-v1-test:
... skipping 93 lines ...
daemonset.apps/bind created
apps.sh:73: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[deprecated.daemonset.template.generation:1 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=https://127.0.0.1:6443 --insecure-skip-tls-verify=true --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1610541700-23884"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:2.0","name":"kubernetes-pause"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}} kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=https://127.0.0.1:6443 --insecure-skip-tls-verify=true --match-server-version=true]:
daemonset.apps/bind skipped rollback (current template already matches revision 1)
apps.sh:76: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:77: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
E0113 12:41:40.973678 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind configured
apps.sh:80: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
apps.sh:81: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:82: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
apps.sh:83: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[deprecated.daemonset.template.generation:2 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=https://127.0.0.1:6443 --insecure-skip-tls-verify=true --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1610541700-23884"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:latest","name":"kubernetes-pause"},{"image":"k8s.gcr.io/nginx:test-cmd","name":"app"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}} kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=https://127.0.0.1:6443 --insecure-skip-tls-verify=true --match-server-version=true]:map[deprecated.daemonset.template.generation:1 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=https://127.0.0.1:6443 --insecure-skip-tls-verify=true --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1610541700-23884"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:2.0","name":"kubernetes-pause"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
... skipping 14 lines ...
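apps.sh:88-103 below walk the bind DaemonSet back and forth between its two controllerrevisions and probe a nonexistent revision; the daemon_controller.go conflict that follows is the controller racing the rollback, which this run tolerates as retryable noise. A sketch of the commands being exercised, assuming the namespace from the log:

  kubectl rollout undo daemonset/bind -n namespace-1610541700-23884                        # back to revision 1
  kubectl rollout undo daemonset/bind -n namespace-1610541700-23884 --to-revision=1000000  # error: revision not found in history
  kubectl rollout undo daemonset/bind -n namespace-1610541700-23884                        # forward to revision 2 again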
apps.sh:88: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:89: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
daemonset.apps/bind rolled back
apps.sh:92: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:93: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
Successful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
apps.sh:97: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:98: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
daemonset.apps/bind rolled back
E0113 12:41:42.917436 58414 daemon_controller.go:320] namespace-1610541700-23884/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1610541700-23884", SelfLink:"", UID:"25b5079c-696d-4e4c-9959-e1300f42cb85", ResourceVersion:"2002", Generation:4, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63746138500, loc:(*time.Location)(0x6f98580)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"deprecated.daemonset.template.generation":"4", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=https://127.0.0.1:6443 --insecure-skip-tls-verify=true --match-server-version=true\"},\"labels\":{\"service\":\"bind\"},\"name\":\"bind\",\"namespace\":\"namespace-1610541700-23884\"},\"spec\":{\"selector\":{\"matchLabels\":{\"service\":\"bind\"}},\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n", "kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=https://127.0.0.1:6443 --insecure-skip-tls-verify=true --match-server-version=true"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"kube-controller-manager", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001507580), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc0015075a0)}, v1.ManagedFieldsEntry{Manager:"kubectl-client-side-apply", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc0015075c0), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc0015075e0)}, v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001507620), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc001507660)}}}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc0015076c0), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:latest", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}, v1.Container{Name:"app", Image:"k8s.gcr.io/nginx:test-cmd", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc003247ac8), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc0007a01c0), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc0015076e0), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil), SetHostnameAsFQDN:(*bool)(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc00304e1a0)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc003247b1c)}, Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:3, UpdatedNumberScheduled:0, NumberAvailable:0, NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
apps.sh:101: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
apps.sh:102: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:103: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
daemonset.apps "bind" deleted
+++ exit code: 0
Recording: run_rc_tests
... skipping 32 lines ...
Namespace:    namespace-1610541703-750
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
Namespace:    namespace-1610541703-750
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
Namespace:    namespace-1610541703-750
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 12 lines ...
Namespace:    namespace-1610541703-750
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 27 lines ...
Namespace:    namespace-1610541703-750
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
Namespace:    namespace-1610541703-750
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 9 lines ...
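The daemonset rollback assertions above (apps.sh:92 through apps.sh:103) exercise roughly this sequence; a sketch, with the daemonset name and revision numbers taken from the log:

  kubectl rollout undo daemonset bind                         # back to revision 1 (pause:2.0, one container)
  kubectl rollout undo daemonset bind --to-revision=1000000   # fails: unable to find specified revision
  kubectl rollout undo daemonset bind --to-revision=2         # forward again to pause:latest + nginx:test-cmd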
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-z2l9z
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-5m9m8
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-w6kdj
E0113 12:41:45.409394 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1610541703-750
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 11 lines ...
Namespace:    namespace-1610541703-750
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 15 lines ...
core.sh:1224: Successful get rc frontend {{.spec.replicas}}: 3
replicationcontroller/frontend scaled
E0113 12:41:45.725019 58414 replica_set.go:201] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend namespace-1610541703-750 a4bc9a42-1394-40a1-b464-95f722ccc969 2039 2 2021-01-13 12:41:44 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] [] [{kube-controller-manager Update v1 2021-01-13 12:41:44 +0000 UTC FieldsV1 {"f:status":{"f:fullyLabeledReplicas":{},"f:observedGeneration":{},"f:replicas":{}}}} {kubectl-create Update v1 2021-01-13 12:41:44 +0000 UTC FieldsV1 {"f:metadata":{"f:labels":{".":{},"f:app":{},"f:tier":{}}},"f:spec":{"f:replicas":{},"f:selector":{".":{},"f:app":{},"f:tier":{}},"f:template":{".":{},"f:metadata":{".":{},"f:creationTimestamp":{},"f:labels":{".":{},"f:app":{},"f:tier":{}}},"f:spec":{".":{},"f:containers":{".":{},"k:{\"name\":\"php-redis\"}":{".":{},"f:env":{".":{},"k:{\"name\":\"GET_HOSTS_FROM\"}":{".":{},"f:name":{},"f:value":{}}},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:ports":{".":{},"k:{\"containerPort\":80,\"protocol\":\"TCP\"}":{".":{},"f:containerPort":{},"f:protocol":{}}},"f:resources":{".":{},"f:requests":{".":{},"f:cpu":{},"f:memory":{}}},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}}}}}]},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{ 0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] [] []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v4 [] [] [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc002c0ccd8 <nil> ClusterFirst map[] <nil> false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,FSGroupChangePolicy:nil,SeccompProfile:nil,} [] nil default-scheduler [] [] <nil> nil [] <nil> <nil> <nil> map[] [] <nil>}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:1,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
I0113 12:41:45.731872 58414 event.go:291] "Event occurred" object="namespace-1610541703-750/frontend" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: frontend-w6kdj"
core.sh:1228: Successful get rc frontend {{.spec.replicas}}: 2
core.sh:1232: Successful get rc frontend {{.spec.replicas}}: 2
error: Expected replicas to be 3, was 2
core.sh:1236: Successful get rc frontend {{.spec.replicas}}: 2
core.sh:1240: Successful get rc frontend {{.spec.replicas}}: 2
replicationcontroller/frontend scaled
I0113 12:41:46.315378 58414 event.go:291] "Event occurred" object="namespace-1610541703-750/frontend" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-6sjcb"
core.sh:1244: Successful get rc frontend {{.spec.replicas}}: 3
core.sh:1248: Successful get rc frontend {{.spec.replicas}}: 3
... skipping 15 lines ...
I0113 12:41:47.309917 58414 event.go:291] "Event occurred" object="namespace-1610541703-750/redis-slave" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: redis-slave-lnnds"
I0113 12:41:47.314269 58414 event.go:291] "Event occurred" object="namespace-1610541703-750/redis-slave" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: redis-slave-smvxt"
core.sh:1262: Successful get rc redis-master {{.spec.replicas}}: 4
core.sh:1263: Successful get rc redis-slave {{.spec.replicas}}: 4
replicationcontroller "redis-master" deleted
replicationcontroller "redis-slave" deleted
E0113 12:41:47.613002 58414 replica_set.go:532] sync "namespace-1610541703-750/redis-slave" failed with replicationcontrollers "redis-slave" not found
deployment.apps/nginx-deployment created
I0113 12:41:47.800496 58414 event.go:291] "Event occurred" object="namespace-1610541703-750/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-76b5cd66f5 to 3"
I0113 12:41:47.806939 58414 event.go:291] "Event occurred" object="namespace-1610541703-750/nginx-deployment-76b5cd66f5" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-76b5cd66f5-tbrl5"
I0113 12:41:47.811439 58414 event.go:291] "Event occurred" object="namespace-1610541703-750/nginx-deployment-76b5cd66f5" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-76b5cd66f5-zm7gm"
I0113 12:41:47.811474 58414 event.go:291] "Event occurred" object="namespace-1610541703-750/nginx-deployment-76b5cd66f5" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-76b5cd66f5-gqmn8"
deployment.apps/nginx-deployment scaled
... skipping 4 lines ...
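The scale checks above (core.sh:1224 through core.sh:1263) drive `kubectl scale`, including its --current-replicas precondition, which is where the "Expected replicas to be 3, was 2" error comes from; a sketch with names and counts taken from the log:

  kubectl scale rc frontend --replicas=2
  kubectl scale rc frontend --current-replicas=3 --replicas=3   # rejected: actual replica count is 2
  kubectl scale rc frontend --current-replicas=2 --replicas=3   # precondition holds, scales to 3
  kubectl scale rc redis-master redis-slave --replicas=4        # several controllers in one call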
(B[mdeployment.apps "nginx-deployment" deleted Successful message:service/expose-test-deployment exposed has:service/expose-test-deployment exposed service "expose-test-deployment" deleted Successful message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed See 'kubectl expose -h' for help and examples has:invalid deployment: no selectors deployment.apps/nginx-deployment created I0113 12:41:48.574684 58414 event.go:291] "Event occurred" object="namespace-1610541703-750/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-76b5cd66f5 to 3" I0113 12:41:48.579367 58414 event.go:291] "Event occurred" object="namespace-1610541703-750/nginx-deployment-76b5cd66f5" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-76b5cd66f5-lckvl" I0113 12:41:48.582114 58414 event.go:291] "Event occurred" object="namespace-1610541703-750/nginx-deployment-76b5cd66f5" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-76b5cd66f5-b48fv" ... skipping 23 lines ... service "frontend" deleted service "frontend-2" deleted service "frontend-3" deleted service "frontend-4" deleted service "frontend-5" deleted Successful message:error: cannot expose a Node has:cannot expose Successful message:The Service "invalid-large-service-name-that-has-more-than-sixty-three-characters" is invalid: metadata.name: Invalid value: "invalid-large-service-name-that-has-more-than-sixty-three-characters": must be no more than 63 characters has:metadata.name: Invalid value Successful message:service/kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed ... skipping 30 lines ... (B[mhorizontalpodautoscaler.autoscaling/frontend autoscaled [32mcore.sh:1391: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70 (B[mhorizontalpodautoscaler.autoscaling "frontend" deleted horizontalpodautoscaler.autoscaling/frontend autoscaled [32mcore.sh:1395: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80 (B[mhorizontalpodautoscaler.autoscaling "frontend" deleted Error: required flag(s) "max" not set replicationcontroller "frontend" deleted [32mcore.sh:1404: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: (B[mapiVersion: apps/v1 kind: Deployment metadata: creationTimestamp: null ... skipping 24 lines ... 
        limits:
          cpu: 300m
        requests:
          cpu: 300m
      terminationGracePeriodSeconds: 0
status: {}
Error from server (NotFound): deployments.apps "nginx-deployment-resources" not found
deployment.apps/nginx-deployment-resources created
I0113 12:41:54.471664 58414 event.go:291] "Event occurred" object="namespace-1610541703-750/nginx-deployment-resources" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-resources-748ddcb48b to 3"
I0113 12:41:54.477615 58414 event.go:291] "Event occurred" object="namespace-1610541703-750/nginx-deployment-resources-748ddcb48b" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-748ddcb48b-xtxh7"
I0113 12:41:54.481607 58414 event.go:291] "Event occurred" object="namespace-1610541703-750/nginx-deployment-resources-748ddcb48b" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-748ddcb48b-28s9p"
I0113 12:41:54.483788 58414 event.go:291] "Event occurred" object="namespace-1610541703-750/nginx-deployment-resources-748ddcb48b" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-748ddcb48b-zjn2h"
core.sh:1410: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment-resources:
core.sh:1411: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
core.sh:1412: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment-resources resource requirements updated
I0113 12:41:54.890326 58414 event.go:291] "Event occurred" object="namespace-1610541703-750/nginx-deployment-resources" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-resources-7bfb7d56b6 to 1"
I0113 12:41:54.896767 58414 event.go:291] "Event occurred" object="namespace-1610541703-750/nginx-deployment-resources-7bfb7d56b6" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-7bfb7d56b6-wh6z5"
E0113 12:41:54.934014 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1415: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 100m:
core.sh:1416: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
error: unable to find container named redis
deployment.apps/nginx-deployment-resources resource requirements updated
I0113 12:41:55.373286 58414 event.go:291] "Event occurred" object="namespace-1610541703-750/nginx-deployment-resources" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled down replica set nginx-deployment-resources-748ddcb48b to 2"
I0113 12:41:55.379973 58414 event.go:291] "Event occurred" object="namespace-1610541703-750/nginx-deployment-resources-748ddcb48b" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: nginx-deployment-resources-748ddcb48b-28s9p"
I0113 12:41:55.387383 58414 event.go:291] "Event occurred" object="namespace-1610541703-750/nginx-deployment-resources" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-resources-75dbcccf44 to 1"
I0113 12:41:55.393578 58414 event.go:291] "Event occurred" object="namespace-1610541703-750/nginx-deployment-resources-75dbcccf44" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-75dbcccf44-qs8bp"
core.sh:1421: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
... skipping 390 lines ...
    status: "True"
    type: Progressing
  observedGeneration: 4
  replicas: 4
  unavailableReplicas: 4
  updatedReplicas: 1
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
core.sh:1432: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
core.sh:1433: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
core.sh:1434: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
... skipping 46 lines ...
                pod-template-hash=69dd6dcd84
Annotations:    deployment.kubernetes.io/desired-replicas: 1
                deployment.kubernetes.io/max-replicas: 2
                deployment.kubernetes.io/revision: 1
Controlled By:  Deployment/test-nginx-apps
Replicas:       1 current / 1 desired
Pods Status:    0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=test-nginx-apps
           pod-template-hash=69dd6dcd84
  Containers:
   nginx:
    Image:      k8s.gcr.io/nginx:test-cmd
... skipping 102 lines ...
apps.sh:305: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
    Image:      k8s.gcr.io/nginx:test-cmd
deployment.apps/nginx rolled back (server dry run)
apps.sh:309: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
deployment.apps/nginx rolled back
apps.sh:313: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
error: unable to find specified revision 1000000 in history
apps.sh:316: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
deployment.apps/nginx rolled back
apps.sh:320: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
deployment.apps/nginx paused
error: you cannot rollback a paused deployment; resume it first with 'kubectl rollout resume deployment/nginx' and try again
error: deployments.apps "nginx" can't restart paused deployment (run rollout resume first)
deployment.apps/nginx resumed
deployment.apps/nginx rolled back
    deployment.kubernetes.io/revision-history: 1,3
error: desired revision (3) is different from the running revision (5)
deployment.apps/nginx restarted
I0113 12:42:07.096690 58414 event.go:291] "Event occurred" object="namespace-1610541716-14791/nginx" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled down replica set nginx-54785cbcb8 to 2"
I0113 12:42:07.104453 58414 event.go:291] "Event occurred" object="namespace-1610541716-14791/nginx-54785cbcb8" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: nginx-54785cbcb8-5cbdv"
I0113 12:42:07.107110 58414 event.go:291] "Event occurred" object="namespace-1610541716-14791/nginx" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-5465bb8497 to 1"
I0113 12:42:07.114290 58414 event.go:291] "Event occurred" object="namespace-1610541716-14791/nginx-5465bb8497" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-5465bb8497-vs8cl"
Successful
... skipping 148 lines ...
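The paused-rollout errors above map onto this `kubectl rollout` sequence; a sketch, with the deployment name taken from the log:

  kubectl rollout undo deployment nginx --dry-run=server   # "rolled back (server dry run)", no spec change
  kubectl rollout undo deployment nginx
  kubectl rollout pause deployment nginx
  kubectl rollout undo deployment nginx      # refused while paused; resume first
  kubectl rollout restart deployment nginx   # likewise refused on a paused deployment
  kubectl rollout resume deployment nginx
  kubectl rollout restart deployment nginx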
apps.sh:364: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment image updated
I0113 12:42:10.139689 58414 event.go:291] "Event occurred" object="namespace-1610541716-14791/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-6dd48b9849 to 1"
I0113 12:42:10.143466 58414 event.go:291] "Event occurred" object="namespace-1610541716-14791/nginx-deployment-6dd48b9849" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-6dd48b9849-lhrvp"
apps.sh:367: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
apps.sh:368: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
error: unable to find container named "redis"
deployment.apps/nginx-deployment image updated
apps.sh:373: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:374: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment image updated
apps.sh:377: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
apps.sh:378: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
... skipping 50 lines ...
I0113 12:42:14.625720 58414 event.go:291] "Event occurred" object="namespace-1610541716-14791/nginx-deployment-59b7fccd97" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: nginx-deployment-59b7fccd97-2sbdf"
I0113 12:42:14.669464 58414 event.go:291] "Event occurred" object="namespace-1610541716-14791/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-57ddd474c4 to 1"
deployment.apps/nginx-deployment env updated
deployment.apps "nginx-deployment" deleted
I0113 12:42:14.822515 58414 event.go:291] "Event occurred" object="namespace-1610541716-14791/nginx-deployment-57ddd474c4" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-57ddd474c4-d29fg"
configmap "test-set-env-config" deleted
E0113 12:42:14.869948 58414 replica_set.go:532] sync "namespace-1610541716-14791/nginx-deployment-68d657fb6" failed with replicasets.apps "nginx-deployment-68d657fb6" not found
E0113 12:42:14.923299 58414 replica_set.go:532] sync "namespace-1610541716-14791/nginx-deployment-59b7fccd97" failed with replicasets.apps "nginx-deployment-59b7fccd97" not found
secret "test-set-env-secret" deleted
+++ exit code: 0
Recording: run_rs_tests
Running command: run_rs_tests
+++ Running case: test-cmd.run_rs_tests
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_rs_tests
+++ [0113 12:42:15] Creating namespace namespace-1610541735-29315
namespace/namespace-1610541735-29315 created
E0113 12:42:15.119961 58414 replica_set.go:532] sync "namespace-1610541716-14791/nginx-deployment-5f8c874568" failed with replicasets.apps "nginx-deployment-5f8c874568" not found
E0113 12:42:15.170435 58414 replica_set.go:532] sync "namespace-1610541716-14791/nginx-deployment-57ddd474c4" failed with replicasets.apps "nginx-deployment-57ddd474c4" not found
Context "test" modified.
+++ [0113 12:42:15] Testing kubectl(v1:replicasets)
apps.sh:541: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}:
replicaset.apps/frontend created
I0113 12:42:15.555813 58414 event.go:291] "Event occurred" object="namespace-1610541735-29315/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-bl9nf"
I0113 12:42:15.559453 58414 event.go:291] "Event occurred" object="namespace-1610541735-29315/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-n2rgt"
I0113 12:42:15.559618 58414 event.go:291] "Event occurred" object="namespace-1610541735-29315/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-x45w7"
+++ [0113 12:42:15] Deleting rs
replicaset.apps "frontend" deleted
E0113 12:42:15.670339 58414 replica_set.go:532] sync "namespace-1610541735-29315/frontend" failed with Operation cannot be fulfilled on replicasets.apps "frontend": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1610541735-29315/frontend, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 94974e6d-753f-4993-ada7-f4ea4f5e37ad, UID in object meta:
apps.sh:547: Successful get pods -l "tier=frontend" {{range.items}}{{.metadata.name}}:{{end}}:
apps.sh:551: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}:
replicaset.apps/frontend created
I0113 12:42:16.047884 58414 event.go:291] "Event occurred" object="namespace-1610541735-29315/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-7zsrr"
I0113 12:42:16.051814 58414 event.go:291] "Event occurred" object="namespace-1610541735-29315/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-6twjd"
I0113 12:42:16.051851 58414 event.go:291] "Event occurred" object="namespace-1610541735-29315/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-xdp8m"
apps.sh:555: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
+++ [0113 12:42:16] Deleting rs
replicaset.apps "frontend" deleted
E0113 12:42:16.272953 58414 replica_set.go:532] sync "namespace-1610541735-29315/frontend" failed with replicasets.apps "frontend" not found
apps.sh:559: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}:
apps.sh:561: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
pod "frontend-6twjd" deleted
pod "frontend-7zsrr" deleted
pod "frontend-xdp8m" deleted
apps.sh:564: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}:
... skipping 16 lines ...
Namespace:    namespace-1610541735-29315
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                Message
  ----    ------            ----  ----                -------
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-2hlr5
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-88tcz
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-5cwv9
E0113 12:42:17.172526 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:576: Successful describe
Name:         frontend
Namespace:    namespace-1610541735-29315
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 18 lines ...
Namespace:    namespace-1610541735-29315
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 12 lines ...
Namespace:    namespace-1610541735-29315
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 25 lines ...
Namespace:    namespace-1610541735-29315
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1610541735-29315
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1610541735-29315
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 11 lines ...
Namespace:    namespace-1610541735-29315
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 127 lines ...
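The near-identical blocks above are the outputs of successive `kubectl describe` invocation styles being compared by the harness; roughly along the lines of (a sketch, resource name from the log):

  kubectl describe rs frontend
  kubectl describe rs/frontend
  kubectl describe rs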
deployment.apps/scale-1 scaled
deployment.apps/scale-2 scaled
deployment.apps/scale-3 scaled
deployment.apps/scale-1 scaled
deployment.apps/scale-2 scaled
deployment.apps/scale-3 scaled
E0113 12:42:19.817325 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:614: Successful get deploy scale-1 {{.spec.replicas}}: 1
apps.sh:615: Successful get deploy scale-2 {{.spec.replicas}}: 1
apps.sh:616: Successful get deploy scale-3 {{.spec.replicas}}: 1
deployment.apps/scale-1 scaled
deployment.apps/scale-2 scaled
I0113 12:42:20.190788 58414 event.go:291] "Event occurred" object="namespace-1610541735-29315/scale-1" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set scale-1-6865bdcf4d to 2"
I0113 12:42:20.193255 58414 event.go:291] "Event occurred" object="namespace-1610541735-29315/scale-2" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set scale-2-6865bdcf4d to 2"
I0113 12:42:20.196309 58414 event.go:291] "Event occurred" object="namespace-1610541735-29315/scale-1-6865bdcf4d" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: scale-1-6865bdcf4d-7npfn"
I0113 12:42:20.197318 58414 event.go:291] "Event occurred" object="namespace-1610541735-29315/scale-2-6865bdcf4d" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: scale-2-6865bdcf4d-s7nqg"
apps.sh:619: Successful get deploy scale-1 {{.spec.replicas}}: 2
E0113 12:42:20.316761 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:620: Successful get deploy scale-2 {{.spec.replicas}}: 2
apps.sh:621: Successful get deploy scale-3 {{.spec.replicas}}: 1
deployment.apps/scale-1 scaled
deployment.apps/scale-2 scaled
I0113 12:42:20.574085 58414 event.go:291] "Event occurred" object="namespace-1610541735-29315/scale-1" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set scale-1-6865bdcf4d to 3"
deployment.apps/scale-3 scaled
... skipping 67 lines ...
horizontalpodautoscaler.autoscaling/frontend autoscaled
apps.sh:706: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
Successful
message:kubectl-autoscale
has:kubectl-autoscale
horizontalpodautoscaler.autoscaling "frontend" deleted
Error: required flag(s) "max" not set
replicaset.apps "frontend" deleted
+++ exit code: 0
Recording: run_stateful_set_tests
Running command: run_stateful_set_tests
+++ Running case: test-cmd.run_stateful_set_tests
... skipping 61 lines ...
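The hpa checks just above (apps.sh:706 and the missing-flag error) correspond to `kubectl autoscale`; a sketch with the min/max/cpu values taken from the log:

  kubectl autoscale rs frontend --min=2 --max=3 --cpu-percent=80
  kubectl get hpa frontend --template='{{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}'
  kubectl autoscale rs frontend --min=2 --cpu-percent=80   # Error: required flag(s) "max" not set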
apps.sh:466: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:467: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
statefulset.apps/nginx rolled back
apps.sh:470: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
apps.sh:471: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
Successful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
apps.sh:475: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
apps.sh:476: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
statefulset.apps/nginx rolled back
apps.sh:479: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
apps.sh:480: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
... skipping 60 lines ...
Name:         mock
Namespace:    namespace-1610541752-31990
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:  k8s.gcr.io/pause:2.0
    Port:   9949/TCP
... skipping 58 lines ...
Name:         mock
Namespace:    namespace-1610541752-31990
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:  k8s.gcr.io/pause:2.0
    Port:   9949/TCP
... skipping 29 lines ...
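The statefulset assertions at the top of this block (apps.sh:466 through apps.sh:480) are the statefulset flavour of the rollback flow; a sketch, with the name and bad revision taken from the log and the final --to-revision value assumed:

  kubectl rollout undo statefulset nginx                         # back to nginx-slim:0.7, one container
  kubectl rollout undo statefulset nginx --to-revision=1000000   # fails: unable to find specified revision
  kubectl rollout undo statefulset nginx --to-revision=2         # forward to nginx-slim:0.8 + pause:2.0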
Testing with file hack/testdata/multi-resource-json.json and replace with file hack/testdata/multi-resource-json-modify.json
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}:
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}:
service/mock created
replicationcontroller/mock created
I0113 12:42:37.485468 58414 event.go:291] "Event occurred" object="namespace-1610541752-31990/mock" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: mock-tb8db"
E0113 12:42:37.510445 58414 garbagecollector.go:350] error syncing item &garbagecollector.node{identity:garbagecollector.objectReference{OwnerReference:v1.OwnerReference{APIVersion:"discovery.k8s.io/v1beta1", Kind:"EndpointSlice", Name:"mock-zcjff", UID:"b0961b05-f2aa-4dab-9de9-53a94ba2d942", Controller:(*bool)(nil), BlockOwnerDeletion:(*bool)(nil)}, Namespace:"namespace-1610541752-31990"}, dependentsLock:sync.RWMutex{w:sync.Mutex{state:0, sema:0x0}, writerSem:0x0, readerSem:0x0, readerCount:1, readerWait:0}, dependents:map[*garbagecollector.node]struct {}{}, deletingDependents:false, deletingDependentsLock:sync.RWMutex{w:sync.Mutex{state:0, sema:0x0}, writerSem:0x0, readerSem:0x0, readerCount:0, readerWait:0}, beingDeleted:false, beingDeletedLock:sync.RWMutex{w:sync.Mutex{state:0, sema:0x0}, writerSem:0x0, readerSem:0x0, readerCount:0, readerWait:0}, virtual:false, virtualLock:sync.RWMutex{w:sync.Mutex{state:0, sema:0x0}, writerSem:0x0, readerSem:0x0, readerCount:0, readerWait:0}, owners:[]v1.OwnerReference{v1.OwnerReference{APIVersion:"v1", Kind:"Service", Name:"mock", UID:"8262d389-7f7b-46b7-8ab4-c6f1b44fb88d", Controller:(*bool)(0xc00361adcc), BlockOwnerDeletion:(*bool)(0xc00361adcd)}}}: endpointslices.discovery.k8s.io "mock-zcjff" not found
generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
service/mock   ClusterIP   10.0.0.186   <none>        99/TCP    0s

NAME                         DESIRED   CURRENT   READY   AGE
... skipping 17 lines ...
Name:         mock
Namespace:    namespace-1610541752-31990
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:  k8s.gcr.io/pause:2.0
    Port:   9949/TCP
... skipping 41 lines ...
Namespace:    namespace-1610541752-31990
Selector:     app=mock
Labels:       app=mock
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:  k8s.gcr.io/pause:2.0
    Port:   9949/TCP
... skipping 11 lines ...
Namespace:    namespace-1610541752-31990
Selector:     app=mock2
Labels:       app=mock2
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock2
  Containers:
   mock-container:
    Image:  k8s.gcr.io/pause:2.0
    Port:   9949/TCP
... skipping 86 lines ...
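The create-then-replace flow being tested here is driven by the file pair named at the top of the block:

  kubectl create -f hack/testdata/multi-resource-json.json           # service/mock + replicationcontroller/mock
  kubectl replace -f hack/testdata/multi-resource-json-modify.json   # re-labels both, e.g. status=replaced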
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
generic-resources.sh:155: Successful get services mock2 {{.metadata.annotations.annotated}}: true
service "mock" deleted
service "mock2" deleted
generic-resources.sh:173: Successful get services {{range.items}}{{.metadata.name}}:{{end}}:
generic-resources.sh:174: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}:
E0113 12:42:44.966521 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
replicationcontroller/mock created
I0113 12:42:45.164064 58414 event.go:291] "Event occurred" object="namespace-1610541752-31990/mock" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: mock-9r5sh"
generic-resources.sh:180: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
generic-resources.sh:181: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
service "mock" deleted
... skipping 10 lines ...
+++ [0113 12:42:45] Creating namespace namespace-1610541765-3617
namespace/namespace-1610541765-3617 created
Context "test" modified.
+++ [0113 12:42:46] Testing persistent volumes
storage.sh:30: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}:
persistentvolume/pv0001 created
E0113 12:42:46.397817 58414 pv_protection_controller.go:118] PV pv0001 failed with : Operation cannot be fulfilled on persistentvolumes "pv0001": the object has been modified; please apply your changes to the latest version and try again
storage.sh:33: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
persistentvolume "pv0001" deleted
persistentvolume/pv0002 created
E0113 12:42:46.825274 58414 pv_protection_controller.go:118] PV pv0002 failed with : Operation cannot be fulfilled on persistentvolumes "pv0002": the object has been modified; please apply your changes to the latest version and try again
storage.sh:36: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0002:
persistentvolume "pv0002" deleted
persistentvolume/pv0003 created
E0113 12:42:47.261647 58414 pv_protection_controller.go:118] PV pv0003 failed with : Operation cannot be fulfilled on persistentvolumes "pv0003": the object has been modified; please apply your changes to the latest version and try again
storage.sh:39: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0003:
persistentvolume "pv0003" deleted
storage.sh:42: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}:
persistentvolume/pv0001 created
storage.sh:45: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
Successful
... skipping 71 lines ...
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Wed, 13 Jan 2021 12:37:07 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                  Message
  ----             ------    -----------------                 ------------------                ------                  -------
  Ready            Unknown   Wed, 13 Jan 2021 12:37:07 +0000   Wed, 13 Jan 2021 12:38:08 +0000   NodeStatusNeverUpdated  Kubelet never posted node status.
  MemoryPressure   Unknown   Wed, 13 Jan 2021 12:37:07 +0000   Wed, 13 Jan 2021 12:38:08 +0000   NodeStatusNeverUpdated  Kubelet never posted node status.
  DiskPressure     Unknown   Wed, 13 Jan 2021 12:37:07 +0000   Wed, 13 Jan 2021 12:38:08 +0000   NodeStatusNeverUpdated  Kubelet never posted node status.
... skipping 30 lines ...
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Wed, 13 Jan 2021 12:37:07 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                  Message
  ----             ------    -----------------                 ------------------                ------                  -------
  Ready            Unknown   Wed, 13 Jan 2021 12:37:07 +0000   Wed, 13 Jan 2021 12:38:08 +0000   NodeStatusNeverUpdated  Kubelet never posted node status.
  MemoryPressure   Unknown   Wed, 13 Jan 2021 12:37:07 +0000   Wed, 13 Jan 2021 12:38:08 +0000   NodeStatusNeverUpdated  Kubelet never posted node status.
  DiskPressure     Unknown   Wed, 13 Jan 2021 12:37:07 +0000   Wed, 13 Jan 2021 12:38:08 +0000   NodeStatusNeverUpdated  Kubelet never posted node status.
... skipping 31 lines ...
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Wed, 13 Jan 2021 12:37:07 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                  Message
  ----             ------    -----------------                 ------------------                ------                  -------
  Ready            Unknown   Wed, 13 Jan 2021 12:37:07 +0000   Wed, 13 Jan 2021 12:38:08 +0000   NodeStatusNeverUpdated  Kubelet never posted node status.
  MemoryPressure   Unknown   Wed, 13 Jan 2021 12:37:07 +0000   Wed, 13 Jan 2021 12:38:08 +0000   NodeStatusNeverUpdated  Kubelet never posted node status.
  DiskPressure     Unknown   Wed, 13 Jan 2021 12:37:07 +0000   Wed, 13 Jan 2021 12:38:08 +0000   NodeStatusNeverUpdated  Kubelet never posted node status.
... skipping 22 lines ...
  Resource           Requests  Limits
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
E0113 12:42:51.179624 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1543: Successful describe
Name:               127.0.0.1
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Wed, 13 Jan 2021 12:37:07 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                  Message
  ----             ------    -----------------                 ------------------                ------                  -------
  Ready            Unknown   Wed, 13 Jan 2021 12:37:07 +0000   Wed, 13 Jan 2021 12:38:08 +0000   NodeStatusNeverUpdated  Kubelet never posted node status.
  MemoryPressure   Unknown   Wed, 13 Jan 2021 12:37:07 +0000   Wed, 13 Jan 2021 12:38:08 +0000   NodeStatusNeverUpdated  Kubelet never posted node status.
  DiskPressure     Unknown   Wed, 13 Jan 2021 12:37:07 +0000   Wed, 13 Jan 2021 12:38:08 +0000   NodeStatusNeverUpdated  Kubelet never posted node status.
... skipping 38 lines ...
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Wed, 13 Jan 2021 12:37:07 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                  Message
  ----             ------    -----------------                 ------------------                ------                  -------
  Ready            Unknown   Wed, 13 Jan 2021 12:37:07 +0000   Wed, 13 Jan 2021 12:38:08 +0000   NodeStatusNeverUpdated  Kubelet never posted node status.
  MemoryPressure   Unknown   Wed, 13 Jan 2021 12:37:07 +0000   Wed, 13 Jan 2021 12:38:08 +0000   NodeStatusNeverUpdated  Kubelet never posted node status.
  DiskPressure     Unknown   Wed, 13 Jan 2021 12:37:07 +0000   Wed, 13 Jan 2021 12:38:08 +0000   NodeStatusNeverUpdated  Kubelet never posted node status.
... skipping 22 lines ...
  Resource           Requests  Limits
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
E0113 12:42:51.415406 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:               127.0.0.1
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Wed, 13 Jan 2021 12:37:07 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                  Message
  ----             ------    -----------------                 ------------------                ------                  -------
  Ready            Unknown   Wed, 13 Jan 2021 12:37:07 +0000   Wed, 13 Jan 2021 12:38:08 +0000   NodeStatusNeverUpdated  Kubelet never posted node status.
  MemoryPressure   Unknown   Wed, 13 Jan 2021 12:37:07 +0000   Wed, 13 Jan 2021 12:38:08 +0000   NodeStatusNeverUpdated  Kubelet never posted node status.
  DiskPressure     Unknown   Wed, 13 Jan 2021 12:37:07 +0000   Wed, 13 Jan 2021 12:38:08 +0000   NodeStatusNeverUpdated  Kubelet never posted node status.
... skipping 30 lines ...
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Wed, 13 Jan 2021 12:37:07 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                  Message
  ----             ------    -----------------                 ------------------                ------                  -------
  Ready            Unknown   Wed, 13 Jan 2021 12:37:07 +0000   Wed, 13 Jan 2021 12:38:08 +0000   NodeStatusNeverUpdated  Kubelet never posted node status.
  MemoryPressure   Unknown   Wed, 13 Jan 2021 12:37:07 +0000   Wed, 13 Jan 2021 12:38:08 +0000   NodeStatusNeverUpdated  Kubelet never posted node status.
  DiskPressure     Unknown   Wed, 13 Jan 2021 12:37:07 +0000   Wed, 13 Jan 2021 12:38:08 +0000   NodeStatusNeverUpdated  Kubelet never posted node status.
... skipping 29 lines ...
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Wed, 13 Jan 2021 12:37:07 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                  Message
  ----             ------    -----------------                 ------------------                ------                  -------
  Ready            Unknown   Wed, 13 Jan 2021 12:37:07 +0000   Wed, 13 Jan 2021 12:38:08 +0000   NodeStatusNeverUpdated  Kubelet never posted node status.
  MemoryPressure   Unknown   Wed, 13 Jan 2021 12:37:07 +0000   Wed, 13 Jan 2021 12:38:08 +0000   NodeStatusNeverUpdated  Kubelet never posted node status.
  DiskPressure     Unknown   Wed, 13 Jan 2021 12:37:07 +0000   Wed, 13 Jan 2021 12:38:08 +0000   NodeStatusNeverUpdated  Kubelet never posted node status.
... skipping 135 lines ...
yes
has:the server doesn't have a resource type
Successful
message:yes
has:yes
Successful
message:error: --subresource can not be used with NonResourceURL
has:subresource can not be used with NonResourceURL
Successful
Successful
message:yes
0
has:0
... skipping 37 lines ...
reconciliation required create
missing rules added:
  {Verbs:[get list watch] APIGroups:[] Resources:[configmaps] ResourceNames:[] NonResourceURLs:[]}
legacy-script.sh:831: Successful get rolebindings -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}:
legacy-script.sh:832: Successful get roles -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}:
legacy-script.sh:833: Successful get clusterrolebindings -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}:
E0113 12:42:54.827852 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
legacy-script.sh:834: Successful get clusterroles -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}:
clusterrole.rbac.authorization.k8s.io/testing-CR reconciled
reconciliation required create
missing rules added:
  {Verbs:[create delete deletecollection get list patch update watch] APIGroups:[] Resources:[pods] ResourceNames:[] NonResourceURLs:[]}
clusterrolebinding.rbac.authorization.k8s.io/testing-CRB reconciled
... skipping 10 lines ...
  {Verbs:[get list watch] APIGroups:[] Resources:[configmaps] ResourceNames:[] NonResourceURLs:[]}
legacy-script.sh:838: Successful get rolebindings -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-RB:
legacy-script.sh:839: Successful get roles -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-R:
legacy-script.sh:840: Successful get clusterrolebindings -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CRB:
legacy-script.sh:841: Successful get clusterroles -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CR:
Successful
message:error: only rbac.authorization.k8s.io/v1 is supported: not *v1beta1.ClusterRole
has:only rbac.authorization.k8s.io/v1 is supported
rolebinding.rbac.authorization.k8s.io "testing-RB" deleted
role.rbac.authorization.k8s.io "testing-R" deleted
warning: deleting cluster-scoped resources, not scoped to the provided namespace
clusterrole.rbac.authorization.k8s.io "testing-CR" deleted
clusterrolebinding.rbac.authorization.k8s.io "testing-CRB" deleted
... skipping 24 lines ...
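The "reconciliation required create / missing rules added" lines above come from `kubectl auth reconcile`; a sketch (the manifest file name is illustrative; the testing-R/testing-RB/testing-CR/testing-CRB objects and label are from the log):

  kubectl auth reconcile -f rbac-manifest.yaml   # creates missing roles/bindings and reports the rules it adds
  kubectl get clusterroles -l test-cmd=auth --template='{{range.items}}{{.metadata.name}}:{{end}}'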
discovery.sh:91: Successful get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}}: cassandra:cassandra:cassandra:cassandra:
pod "cassandra-nhldv" deleted
I0113 12:42:56.720958 58414 event.go:291] "Event occurred" object="namespace-1610541775-10595/cassandra" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: cassandra-v677v"
pod "cassandra-v6mmr" deleted
I0113 12:42:56.730664 58414 event.go:291] "Event occurred" object="namespace-1610541775-10595/cassandra" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: cassandra-hnscs"
replicationcontroller "cassandra" deleted
E0113 12:42:56.737256 58414 replica_set.go:532] sync "namespace-1610541775-10595/cassandra" failed with replicationcontrollers "cassandra" not found
service "cassandra" deleted
+++ exit code: 0
Recording: run_kubectl_explain_tests
Running command: run_kubectl_explain_tests
+++ Running case: test-cmd.run_kubectl_explain_tests
... skipping 1155 lines ...
message:node/127.0.0.1 already uncordoned (server dry run)
has:already uncordoned
node-management.sh:145: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 labeled
node-management.sh:150: Successful get nodes 127.0.0.1 {{.metadata.labels.test}}: label
Successful
message:error: cannot specify both a node name and a --selector option
See 'kubectl drain -h' for help and examples
has:cannot specify both a node name
Successful
message:error: USAGE: cordon NODE [flags]
See 'kubectl cordon -h' for help and examples
has:error\: USAGE\: cordon NODE
node/127.0.0.1 already uncordoned
Successful
message:error: You must provide one or more resources by argument or filename.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
   '<resource> <name>'
   '<resource>'
has:must provide one or more resources
... skipping 14 lines ...
+++ [0113 12:43:28] Testing kubectl plugins
Successful
message:The following compatible plugins are available:
test/fixtures/pkg/kubectl/plugins/version/kubectl-version
  - warning: kubectl-version overwrites existing command: "kubectl version"
error: one plugin warning was found
has:kubectl-version overwrites existing command: "kubectl version"
Successful
message:The following compatible plugins are available:
test/fixtures/pkg/kubectl/plugins/kubectl-foo
test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo
  - warning: test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin: test/fixtures/pkg/kubectl/plugins/kubectl-foo
error: one plugin warning was found
has:test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin
Successful
message:The following compatible plugins are available:
test/fixtures/pkg/kubectl/plugins/kubectl-foo
has:plugins are available
Successful
message:Unable read directory "test/fixtures/pkg/kubectl/plugins/empty" from your PATH: open test/fixtures/pkg/kubectl/plugins/empty: no such file or directory. Skipping...
error: unable to find any kubectl plugins in your PATH
has:unable to find any kubectl plugins in your PATH
Successful
message:I am plugin foo
has:plugin foo
Successful
message:I am plugin bar called with args test/fixtures/pkg/kubectl/plugins/bar/kubectl-bar arg1
... skipping 10 lines ...
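Plugin discovery in the block above works purely by PATH lookup: any executable named kubectl-<name> becomes `kubectl <name>`. A sketch using the fixture directory from the log:

  PATH=test/fixtures/pkg/kubectl/plugins:$PATH kubectl plugin list   # finds kubectl-foo
  PATH=test/fixtures/pkg/kubectl/plugins:$PATH kubectl foo           # prints "I am plugin foo"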
+++ Running case: test-cmd.run_impersonation_tests
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_impersonation_tests
+++ [0113 12:43:28] Testing impersonation
Successful
message:error: requesting groups or user-extra for test-admin without impersonating a user
has:without impersonating a user
Warning: certificates.k8s.io/v1beta1 CertificateSigningRequest is deprecated in v1.19+, unavailable in v1.22+; use certificates.k8s.io/v1 CertificateSigningRequest
certificatesigningrequest.certificates.k8s.io/foo created
authorization.sh:68: Successful get csr/foo {{.spec.username}}: user1
authorization.sh:69: Successful get csr/foo {{range .spec.groups}}{{.}}{{end}}: system:authenticated
Warning: certificates.k8s.io/v1beta1 CertificateSigningRequest is deprecated in v1.19+, unavailable in v1.22+; use certificates.k8s.io/v1 CertificateSigningRequest
certificatesigningrequest.certificates.k8s.io "foo" deleted
E0113 12:43:29.790362 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Warning: certificates.k8s.io/v1beta1 CertificateSigningRequest is deprecated in v1.19+, unavailable in v1.22+; use certificates.k8s.io/v1 CertificateSigningRequest
certificatesigningrequest.certificates.k8s.io/foo created
authorization.sh:74: Successful get csr/foo {{len .spec.groups}}: 4
authorization.sh:75: Successful get csr/foo {{range .spec.groups}}{{.}} {{end}}: group2 group1 ,,,chameleon system:authenticated
Warning: certificates.k8s.io/v1beta1 CertificateSigningRequest is deprecated in v1.19+, unavailable in v1.22+; use certificates.k8s.io/v1 CertificateSigningRequest
certificatesigningrequest.certificates.k8s.io "foo" deleted
... skipping 12 lines ...
deployment.apps/test-1 created
I0113 12:43:30.419054 58414 event.go:291] "Event occurred" object="namespace-1610541810-14314/test-1-7487ff9cbb" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: test-1-7487ff9cbb-kdcp6"
I0113 12:43:30.500487 58414 event.go:291] "Event occurred" object="namespace-1610541810-14314/test-2" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set test-2-646997777c to 1"
I0113 12:43:30.503846 58414 event.go:291] "Event occurred" object="namespace-1610541810-14314/test-2-646997777c" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: test-2-646997777c-wqz4n"
deployment.apps/test-2 created
wait.sh:36: Successful get deployments {{range .items}}{{.metadata.name}},{{end}}: test-1,test-2,
E0113 12:43:30.888540 58414 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "test-1" deleted
deployment.apps "test-2" deleted
Successful
message:deployment.apps/test-1 condition met
deployment.apps/test-2 condition met
has:test-1 condition met
... skipping 59 lines ...
pod "node-debugger-127.0.0.1-bxq55" force deleted
+++ exit code: 0
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
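The impersonation cases drive kubectl's global --as/--as-group flags, and the "condition met" lines come from kubectl wait. A rough hand-run equivalent (csr.yaml stands in for the test fixture):

# Create a CSR while impersonating user1 in two groups; the API server
# records the impersonated identity in .spec.username and .spec.groups
kubectl create -f csr.yaml --as=user1 --as-group=group1 --as-group=group2
# Block until the deployments report the Available condition, as asserted above
kubectl wait --for=condition=Available deployment/test-1 deployment/test-2 --timeout=60s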
No resources found
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
No resources found
FAILED TESTS: run_kubectl_request_timeout_tests,
I0113 12:43:36.725962 54773 controller.go:89] Shutting down OpenAPI AggregationController
I0113 12:43:36.726029 54773 controller.go:181] Shutting down kubernetes service endpoint reconciler
I0113 12:43:36.726302 54773 naming_controller.go:302] Shutting down NamingConditionController
I0113 12:43:36.726335 54773 cluster_authentication_trust_controller.go:463] Shutting down cluster_authentication_trust_controller controller
I0113 12:43:36.726348 54773 controller.go:123] Shutting down OpenAPI controller
I0113 12:43:36.726359 54773 apf_controller.go:259] Shutting down API Priority and Fairness config worker
... skipping 13 lines ...
I0113 12:43:36.726832 54773 tlsconfig.go:255] Shutting down DynamicServingCertificateController
I0113 12:43:36.726857 54773 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0113 12:43:36.726860 54773 dynamic_serving_content.go:145] Shutting down serving-cert::/tmp/apiserver.crt::/tmp/apiserver.key
W0113 12:43:36.727208 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
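The single failing case, run_kubectl_request_timeout_tests, exercises kubectl's global --request-timeout flag; everything below this point is teardown noise rather than the failure itself. Roughly the shape of what that case drives (the pod name is hypothetical):

# A generous timeout should return normally
kubectl get pods valid-pod --request-timeout=1m
# A very short timeout should fail client-side with a timeout error,
# which is the behavior the test asserts on
kubectl get pods valid-pod --request-timeout=1ms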
... skipping repeated "blockingPicker: the picked transport is not ready, loop back to repick" and etcd connection-refused lines emitted while the API server shut down ...
I0113 12:43:36.727882 54773 secure_serving.go:241] Stopped listening on 127.0.0.1:6443
... skipping more of the same repeated lines ...
junit report dir: /logs/artifacts
+++ [0113 12:43:36] Clean up complete
make: *** [Makefile:318: test-cmd] Error 1
+ EXIT_VALUE=2
+ set +o xtrace
Cleaning up after docker in docker.
================================================================================
Cleaning up after docker
Stopping Docker: docker
W0113 12:43:37.727684 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping repeated etcd connection-refused warnings from 12:43:37 through 12:43:39 ...
W0113 12:43:39.380540 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0113 12:43:39.383549 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.384545 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.387530 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.409067 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.415225 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.415480 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.433975 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.439631 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.440313 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.450234 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.472673 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.474704 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.494600 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... 
W0113 12:43:39.495764 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.518293 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.518829 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.526906 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.553567 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.556104 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.564097 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.571275 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.587745 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.590298 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.591624 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.607798 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.610615 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... 
W0113 12:43:39.614186 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.628634 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:39.634895 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.183822 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.222072 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.338847 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.345672 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.350725 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.395948 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.396855 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.413996 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.415449 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.417257 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... 
W0113 12:43:41.437115 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.446737 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.447266 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.527925 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.532489 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.544565 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.564302 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.573702 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.682222 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.683867 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.700738 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.700836 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.709060 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... 
W0113 12:43:41.748444 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.758804 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.761334 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.766382 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.779487 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.804748 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.837257 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.844740 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.846165 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.847079 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.850319 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.861022 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.873312 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... 
W0113 12:43:41.887990 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.888643 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.899380 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.902793 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.923647 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.937189 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.946017 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.954510 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.961169 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.969067 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:41.975033 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.003403 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.021714 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... 
W0113 12:43:42.040506 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.041869 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.047359 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.050527 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.065354 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.067663 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.071357 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.077848 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.094150 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.105515 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.116640 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.182480 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.187350 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... 
W0113 12:43:42.192683 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.194522 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.239045 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.251449 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.281408 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.288197 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.325207 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.348765 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.352505 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.353616 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.369444 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.466658 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:42.478985 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... 
W0113 12:43:44.774699 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:44.829866 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:44.902002 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:44.910836 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:44.997692 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:44.998669 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.020158 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.118729 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.135596 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.161523 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.253719 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.257120 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.270396 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... 
W0113 12:43:45.360897 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.363495 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.388877 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.420386 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.488244 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.609820 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.612179 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.634394 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.638150 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.711233 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.734101 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.768159 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.775368 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... 
W0113 12:43:45.779338 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.800731 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.802737 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.804091 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.806023 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.814316 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.838192 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.839774 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.871178 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.910231 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.923520 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.958988 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.961494 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... 
W0113 12:43:45.974531 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:45.993186 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:46.004993 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:46.011588 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:46.064774 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:46.171042 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:46.177051 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:46.193121 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:46.237594 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:46.240917 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:46.252643 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:46.274189 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:46.291151 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... 
W0113 12:43:46.302508 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:46.323501 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:46.337043 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:46.346780 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:46.363886 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:46.429346 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:46.431005 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:46.452962 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:46.473291 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:46.482541 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:46.501982 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:46.527529 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... W0113 12:43:46.559544 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting... 
W0113 12:43:46.646213 54773 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
[the warning above repeats with varying timestamps from 12:43:46 through 12:43:54 while etcd at 127.0.0.1:2379 remained unreachable; ~85 duplicate lines omitted]
W0113 12:43:56.726133 54773 controller.go:193] RemoveEndpoints() timed out
Program process in pidfile '/var/run/docker-ssd.pid', 1 process(es), refused to die.
================================================================================
Done cleaning up after docker in docker.