Result: FAILURE
Tests: 1 failed / 629 succeeded
Started: 2019-03-15 08:12
Elapsed: 30m59s
Revision:
Builder: gke-prow-containerd-pool-99179761-wh4m
pod: e72ba75a-46f9-11e9-ab9f-0a580a6c0a8e
resultstore: https://source.cloud.google.com/results/invocations/cb8b916f-8d99-42b6-a40b-cc3a214e05e8/targets/test
infra-commit: 031c214dd
repo: k8s.io/kubernetes
repo-commit: d5a3db003916b1d33b503ccd2e4897e094d8af77
repos: {u'k8s.io/kubernetes': u'master'}

Test Failures


k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/test/integration TestValidateOnlyStatus 1.82s

go test -v k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/test/integration -run TestValidateOnlyStatus$
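(The trailing $ anchors the -run regular expression, so only TestValidateOnlyStatus itself is executed rather than every test whose name starts with that string.)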
I0315 08:37:49.197555  126841 secure_serving.go:160] Stopped listening on 127.0.0.1:37073
I0315 08:37:49.198349  126841 serving.go:312] Generated self-signed cert (/tmp/apiextensions-apiserver437339230/apiserver.crt, /tmp/apiextensions-apiserver437339230/apiserver.key)
I0315 08:37:50.198737  126841 clientconn.go:551] parsed scheme: ""
I0315 08:37:50.198768  126841 clientconn.go:557] scheme "" not registered, fallback to default scheme
I0315 08:37:50.198813  126841 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0315 08:37:50.198855  126841 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0315 08:37:50.199266  126841 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 08:37:50.305672  126841 mutation_detector.go:48] Mutation detector is enabled, this will result in memory leakage.
I0315 08:37:50.306993  126841 clientconn.go:551] parsed scheme: ""
I0315 08:37:50.307023  126841 clientconn.go:557] scheme "" not registered, fallback to default scheme
I0315 08:37:50.307075  126841 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0315 08:37:50.307209  126841 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0315 08:37:50.307570  126841 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0315 08:37:50.307725  126841 clientconn.go:551] parsed scheme: ""
I0315 08:37:50.307747  126841 clientconn.go:557] scheme "" not registered, fallback to default scheme
I0315 08:37:50.307783  126841 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0315 08:37:50.307818  126841 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0315 08:37:50.308182  126841 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 08:37:50.309580  126841 mutation_detector.go:48] Mutation detector is enabled, this will result in memory leakage.
I0315 08:37:50.311017  126841 secure_serving.go:116] Serving securely on 127.0.0.1:40945
I0315 08:37:50.311115  126841 crd_finalizer.go:242] Starting CRDFinalizer
I0315 08:37:50.311143  126841 establishing_controller.go:73] Starting EstablishingController
I0315 08:37:50.311150  126841 customresource_discovery_controller.go:208] Starting DiscoveryController
I0315 08:37:50.311198  126841 naming_controller.go:284] Starting NamingConditionController
E0315 08:37:50.311673  126841 reflector.go:126] k8s.io/client-go/informers/factory.go:133: Failed to list *v1.Service: Get http://127.1.2.3:12345/api/v1/services?limit=500&resourceVersion=0: dial tcp 127.1.2.3:12345: connect: connection refused
I0315 08:37:50.919251  126841 clientconn.go:551] parsed scheme: ""
I0315 08:37:50.919280  126841 clientconn.go:557] scheme "" not registered, fallback to default scheme
I0315 08:37:50.919316  126841 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0315 08:37:50.919377  126841 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0315 08:37:50.919808  126841 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0315 08:37:50.920407  126841 clientconn.go:551] parsed scheme: ""
I0315 08:37:50.920433  126841 clientconn.go:557] scheme "" not registered, fallback to default scheme
I0315 08:37:50.920470  126841 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0315 08:37:50.920521  126841 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0315 08:37:50.920888  126841 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
testserver.go:141: runtime-config=map[api/all:true]
testserver.go:142: Starting apiextensions-apiserver on port 40945...
testserver.go:160: Waiting for /healthz to be ok...
subresources_test.go:537: unexpected error: WishIHadChosenNoxu.mygroup.example.com "foo" is invalid: apiVersion: Invalid value: "mygroup.example.com/v1beta1": must be mygroup.example.com/v1
panic: runtime error: invalid memory address or nil pointer dereference
/usr/local/go/src/testing/testing.go:792 +0x387
/usr/local/go/src/runtime/panic.go:513 +0x1b9
/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/test/integration/subresources_test.go:541 +0xa27
/usr/local/go/src/testing/testing.go:827 +0xbf
/usr/local/go/src/testing/testing.go:878 +0x35c
				from junit_d431ed5f68ae4ddf888439fb96b687a923412204_20190315-082941.xml
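The shape of this failure — an unexpected validation error reported at subresources_test.go:537, then a nil-pointer panic at line 541 — is consistent with a test that records the error via t.Errorf (which marks the test failed but keeps executing) and then dereferences the nil object returned alongside that error. A minimal, self-contained Go sketch of that failure mode (all names hypothetical; this is not the actual test code):

package main

import "fmt"

// object stands in for the custom resource object returned by the client.
type object struct{ resourceVersion string }

// update mimics the client call at subresources_test.go:537: on a
// validation failure it returns a nil object alongside the error.
func update() (*object, error) {
	return nil, fmt.Errorf(`apiVersion: Invalid value: "mygroup.example.com/v1beta1": must be mygroup.example.com/v1`)
}

func main() {
	obj, err := update()
	if err != nil {
		// In a test this would be t.Errorf, which does not stop execution;
		// t.Fatalf here would have prevented the panic below.
		fmt.Println("unexpected error:", err)
	}
	// obj is nil at this point, so the dereference panics with the same
	// "invalid memory address or nil pointer dereference" seen above.
	fmt.Println(obj.resourceVersion)
}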



629 passed tests; 4 skipped tests.

Error lines from build-log.txt

... skipping 302 lines ...
W0315 08:23:57.056] I0315 08:23:57.055499   55692 serving.go:312] Generated self-signed cert (/tmp/apiserver.crt, /tmp/apiserver.key)
W0315 08:23:57.056] I0315 08:23:57.055579   55692 server.go:559] external host was not specified, using 172.17.0.2
W0315 08:23:57.057] W0315 08:23:57.055591   55692 authentication.go:415] AnonymousAuth is not allowed with the AlwaysAllow authorizer. Resetting AnonymousAuth to false. You should use a different authorizer
W0315 08:23:57.057] I0315 08:23:57.055966   55692 server.go:146] Version: v1.15.0-alpha.0.1222+d5a3db003916b1
W0315 08:23:57.451] I0315 08:23:57.450816   55692 plugins.go:158] Loaded 4 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority.
W0315 08:23:57.452] I0315 08:23:57.450859   55692 plugins.go:161] Loaded 4 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ResourceQuota.
W0315 08:23:57.452] E0315 08:23:57.451449   55692 prometheus.go:138] failed to register depth metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 08:23:57.452] E0315 08:23:57.451521   55692 prometheus.go:150] failed to register adds metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 08:23:57.453] E0315 08:23:57.451573   55692 prometheus.go:162] failed to register latency metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 08:23:57.453] E0315 08:23:57.451621   55692 prometheus.go:174] failed to register work_duration metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 08:23:57.453] E0315 08:23:57.451647   55692 prometheus.go:189] failed to register unfinished_work_seconds metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 08:23:57.453] E0315 08:23:57.451671   55692 prometheus.go:202] failed to register longest_running_processor_microseconds metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 08:23:57.454] I0315 08:23:57.451687   55692 plugins.go:158] Loaded 4 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority.
W0315 08:23:57.454] I0315 08:23:57.451692   55692 plugins.go:161] Loaded 4 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ResourceQuota.
W0315 08:23:57.454] I0315 08:23:57.453229   55692 clientconn.go:551] parsed scheme: ""
W0315 08:23:57.455] I0315 08:23:57.453260   55692 clientconn.go:557] scheme "" not registered, fallback to default scheme
W0315 08:23:57.455] I0315 08:23:57.453310   55692 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W0315 08:23:57.455] I0315 08:23:57.453371   55692 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
... skipping 361 lines ...
W0315 08:23:57.979] W0315 08:23:57.978665   55692 genericapiserver.go:344] Skipping API storage.k8s.io/v1alpha1 because it has no resources.
W0315 08:23:58.449] I0315 08:23:58.449272   55692 clientconn.go:551] parsed scheme: ""
W0315 08:23:58.450] I0315 08:23:58.449344   55692 clientconn.go:557] scheme "" not registered, fallback to default scheme
W0315 08:23:58.450] I0315 08:23:58.449396   55692 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W0315 08:23:58.450] I0315 08:23:58.449461   55692 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 08:23:58.451] I0315 08:23:58.450051   55692 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 08:23:58.827] E0315 08:23:58.827207   55692 prometheus.go:138] failed to register depth metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 08:23:58.828] E0315 08:23:58.827262   55692 prometheus.go:150] failed to register adds metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 08:23:58.828] E0315 08:23:58.827416   55692 prometheus.go:162] failed to register latency metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 08:23:58.828] E0315 08:23:58.827523   55692 prometheus.go:174] failed to register work_duration metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 08:23:58.829] E0315 08:23:58.827554   55692 prometheus.go:189] failed to register unfinished_work_seconds metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 08:23:58.829] E0315 08:23:58.827575   55692 prometheus.go:202] failed to register longest_running_processor_microseconds metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 08:23:58.829] I0315 08:23:58.827605   55692 plugins.go:158] Loaded 4 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority.
W0315 08:23:58.829] I0315 08:23:58.827746   55692 plugins.go:161] Loaded 4 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ResourceQuota.
W0315 08:23:58.829] I0315 08:23:58.829217   55692 clientconn.go:551] parsed scheme: ""
W0315 08:23:58.829] I0315 08:23:58.829239   55692 clientconn.go:557] scheme "" not registered, fallback to default scheme
W0315 08:23:58.830] I0315 08:23:58.829274   55692 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W0315 08:23:58.830] I0315 08:23:58.829326   55692 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
... skipping 230 lines ...
W0315 08:24:35.765] I0315 08:24:35.764354   58979 controllermanager.go:497] Started "cronjob"
W0315 08:24:35.765] I0315 08:24:35.764555   58979 cronjob_controller.go:94] Starting CronJob Manager
W0315 08:24:35.765] I0315 08:24:35.765101   58979 controllermanager.go:497] Started "csrapproving"
W0315 08:24:35.765] W0315 08:24:35.765122   58979 controllermanager.go:489] Skipping "nodeipam"
W0315 08:24:35.765] I0315 08:24:35.765252   58979 certificate_controller.go:113] Starting certificate controller
W0315 08:24:35.765] I0315 08:24:35.765274   58979 controller_utils.go:1027] Waiting for caches to sync for certificate controller
W0315 08:24:35.766] E0315 08:24:35.765636   58979 core.go:77] Failed to start service controller: WARNING: no cloud provider provided, services of type LoadBalancer will fail
W0315 08:24:35.766] W0315 08:24:35.765655   58979 controllermanager.go:489] Skipping "service"
W0315 08:24:35.766] I0315 08:24:35.766105   58979 controllermanager.go:497] Started "pv-protection"
W0315 08:24:35.767] I0315 08:24:35.766383   58979 pv_protection_controller.go:81] Starting PV protection controller
W0315 08:24:35.767] I0315 08:24:35.766404   58979 controller_utils.go:1027] Waiting for caches to sync for PV protection controller
W0315 08:24:35.767] I0315 08:24:35.766487   58979 controllermanager.go:497] Started "serviceaccount"
W0315 08:24:35.767] I0315 08:24:35.766549   58979 serviceaccounts_controller.go:115] Starting service account controller
... skipping 5 lines ...
W0315 08:24:35.768] I0315 08:24:35.767781   58979 replica_set.go:182] Starting replicaset controller
W0315 08:24:35.769] I0315 08:24:35.767805   58979 controller_utils.go:1027] Waiting for caches to sync for ReplicaSet controller
W0315 08:24:35.769] I0315 08:24:35.768095   58979 controllermanager.go:497] Started "horizontalpodautoscaling"
W0315 08:24:35.769] I0315 08:24:35.768293   58979 horizontal.go:156] Starting HPA controller
W0315 08:24:35.769] I0315 08:24:35.768307   58979 controller_utils.go:1027] Waiting for caches to sync for HPA controller
W0315 08:24:35.769] I0315 08:24:35.768342   58979 node_lifecycle_controller.go:77] Sending events to api server
W0315 08:24:35.770] E0315 08:24:35.768383   58979 core.go:161] failed to start cloud node lifecycle controller: no cloud provider provided
W0315 08:24:35.770] W0315 08:24:35.768392   58979 controllermanager.go:489] Skipping "cloud-node-lifecycle"
W0315 08:24:35.770] I0315 08:24:35.769154   58979 controllermanager.go:497] Started "persistentvolume-binder"
W0315 08:24:35.771] I0315 08:24:35.769831   58979 pv_controller_base.go:270] Starting persistent volume controller
W0315 08:24:35.771] I0315 08:24:35.770031   58979 controller_utils.go:1027] Waiting for caches to sync for persistent volume controller
W0315 08:24:35.975] I0315 08:24:35.975122   58979 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for endpoints
W0315 08:24:35.976] I0315 08:24:35.975221   58979 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for serviceaccounts
... skipping 15 lines ...
W0315 08:24:35.979] I0315 08:24:35.976304   58979 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for controllerrevisions.apps
W0315 08:24:35.979] I0315 08:24:35.976343   58979 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for rolebindings.rbac.authorization.k8s.io
W0315 08:24:35.979] I0315 08:24:35.976393   58979 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for podtemplates
W0315 08:24:35.979] I0315 08:24:35.976431   58979 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for statefulsets.apps
W0315 08:24:35.979] I0315 08:24:35.976462   58979 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for horizontalpodautoscalers.autoscaling
W0315 08:24:35.979] I0315 08:24:35.976505   58979 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for ingresses.networking.k8s.io
W0315 08:24:35.980] E0315 08:24:35.976530   58979 resource_quota_controller.go:171] initial monitor sync has error: couldn't start monitor for resource "extensions/v1beta1, Resource=networkpolicies": unable to monitor quota for resource "extensions/v1beta1, Resource=networkpolicies"
W0315 08:24:35.980] I0315 08:24:35.976583   58979 controllermanager.go:497] Started "resourcequota"
W0315 08:24:35.980] I0315 08:24:35.976658   58979 resource_quota_controller.go:276] Starting resource quota controller
W0315 08:24:35.980] I0315 08:24:35.976738   58979 controller_utils.go:1027] Waiting for caches to sync for resource quota controller
W0315 08:24:35.980] I0315 08:24:35.976765   58979 resource_quota_monitor.go:301] QuotaMonitor running
I0315 08:24:36.151] +++ [0315 08:24:36] On try 3, controller-manager: ok
I0315 08:24:36.361] node/127.0.0.1 created
... skipping 56 lines ...
W0315 08:24:36.482] I0315 08:24:36.402191   58979 controllermanager.go:497] Started "persistentvolume-expander"
W0315 08:24:36.482] I0315 08:24:36.402327   58979 expand_controller.go:153] Starting expand controller
W0315 08:24:36.482] I0315 08:24:36.402362   58979 controller_utils.go:1027] Waiting for caches to sync for expand controller
W0315 08:24:36.482] I0315 08:24:36.402937   58979 controllermanager.go:497] Started "clusterrole-aggregation"
W0315 08:24:36.482] I0315 08:24:36.402988   58979 clusterroleaggregation_controller.go:148] Starting ClusterRoleAggregator
W0315 08:24:36.483] I0315 08:24:36.403003   58979 controller_utils.go:1027] Waiting for caches to sync for ClusterRoleAggregator controller
W0315 08:24:36.483] W0315 08:24:36.439243   58979 actual_state_of_world.go:503] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="127.0.0.1" does not exist
W0315 08:24:36.483] I0315 08:24:36.466544   58979 controller_utils.go:1034] Caches are synced for PV protection controller
W0315 08:24:36.483] I0315 08:24:36.466768   58979 controller_utils.go:1034] Caches are synced for service account controller
W0315 08:24:36.483] I0315 08:24:36.470886   55692 controller.go:606] quota admission added evaluator for: serviceaccounts
W0315 08:24:36.483] I0315 08:24:36.474050   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:24:36.484] I0315 08:24:36.474727   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:24:36.492] I0315 08:24:36.491641   58979 controller_utils.go:1034] Caches are synced for namespace controller
... skipping 43 lines ...
W0315 08:24:37.269] I0315 08:24:37.100859   58979 taint_manager.go:198] Starting NoExecuteTaintManager
W0315 08:24:37.269] I0315 08:24:37.101043   58979 node_lifecycle_controller.go:1009] Controller detected that all Nodes are not-Ready. Entering master disruption mode.
W0315 08:24:37.277] I0315 08:24:37.277037   58979 controller_utils.go:1034] Caches are synced for resource quota controller
W0315 08:24:37.284] I0315 08:24:37.283493   58979 controller_utils.go:1034] Caches are synced for garbage collector controller
W0315 08:24:37.284] I0315 08:24:37.283527   58979 garbagecollector.go:139] Garbage collector: all resource monitors have synced. Proceeding to collect garbage
W0315 08:24:37.304] I0315 08:24:37.303251   58979 controller_utils.go:1034] Caches are synced for ClusterRoleAggregator controller
W0315 08:24:37.317] E0315 08:24:37.317006   58979 clusterroleaggregation_controller.go:180] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again
W0315 08:24:37.318] E0315 08:24:37.317823   58979 clusterroleaggregation_controller.go:180] view failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "view": the object has been modified; please apply your changes to the latest version and try again
W0315 08:24:37.329] E0315 08:24:37.328849   58979 clusterroleaggregation_controller.go:180] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again
I0315 08:24:37.429] Successful: the flag '--client' shows correct client info
I0315 08:24:37.430] Successful: the flag '--client' correctly has no server version info
I0315 08:24:37.430] +++ [0315 08:24:37] Testing kubectl version: verify json output
I0315 08:24:37.492] Successful: --output json has correct client info
I0315 08:24:37.498] Successful: --output json has correct server info
I0315 08:24:37.500] +++ [0315 08:24:37] Testing kubectl version: verify json output using additional --client flag does not contain serverVersion
W0315 08:24:37.600] I0315 08:24:37.475065   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:24:37.601] I0315 08:24:37.475267   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:24:37.676] E0315 08:24:37.674887   58979 resource_quota_controller.go:437] failed to sync resource monitors: couldn't start monitor for resource "extensions/v1beta1, Resource=networkpolicies": unable to monitor quota for resource "extensions/v1beta1, Resource=networkpolicies"
I0315 08:24:37.784] Successful: --client --output json has correct client info
I0315 08:24:37.785] Successful: --client --output json has no server info
I0315 08:24:37.788] +++ [0315 08:24:37] Testing kubectl version: compare json output using additional --short flag
I0315 08:24:37.821] Successful: --short --output client json info is equal to non short result
I0315 08:24:37.827] Successful: --short --output server json info is equal to non short result
I0315 08:24:37.830] +++ [0315 08:24:37] Testing kubectl version: compare json output with yaml output
... skipping 54 lines ...
I0315 08:24:40.626] namespace/namespace-1552638280-21588 created
I0315 08:24:40.698] Context "test" modified.
I0315 08:24:40.705] +++ [0315 08:24:40] Testing RESTMapper
W0315 08:24:40.805] /go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh: line 147: 59701 Terminated              kubectl proxy --port=0 --www=. --api-prefix="$1" > ${PROXY_PORT_FILE} 2>&1
W0315 08:24:40.806] I0315 08:24:40.476832   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:24:40.806] I0315 08:24:40.477133   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 08:24:40.907] +++ [0315 08:24:40] "kubectl get unknownresourcetype" returns error as expected: error: the server doesn't have a resource type "unknownresourcetype"
I0315 08:24:40.907] +++ exit code: 0
I0315 08:24:40.940] NAME                              SHORTNAMES   APIGROUP                       NAMESPACED   KIND
I0315 08:24:40.940] bindings                                                                      true         Binding
I0315 08:24:40.941] componentstatuses                 cs                                          false        ComponentStatus
I0315 08:24:40.941] configmaps                        cm                                          true         ConfigMap
I0315 08:24:40.941] endpoints                         ep                                          true         Endpoints
... skipping 694 lines ...
I0315 08:25:00.333] poddisruptionbudget.policy/test-pdb-3 created
I0315 08:25:00.427] core.sh:251: Successful get pdb/test-pdb-3 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 2
I0315 08:25:00.502] poddisruptionbudget.policy/test-pdb-4 created
I0315 08:25:00.596] core.sh:255: Successful get pdb/test-pdb-4 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 50%
I0315 08:25:00.756] core.sh:261: Successful get pods --namespace=test-kubectl-describe-pod {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 08:25:00.957] pod/env-test-pod created
W0315 08:25:01.058] error: resource(s) were provided, but no name, label selector, or --all flag specified
W0315 08:25:01.058] I0315 08:24:58.487195   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:25:01.059] I0315 08:24:58.487388   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:25:01.059] error: setting 'all' parameter but found a non empty selector. 
W0315 08:25:01.059] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0315 08:25:01.059] I0315 08:24:59.487647   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:25:01.059] I0315 08:24:59.487844   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:25:01.059] I0315 08:24:59.994733   55692 controller.go:606] quota admission added evaluator for: poddisruptionbudgets.policy
W0315 08:25:01.060] I0315 08:25:00.488134   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:25:01.060] I0315 08:25:00.488313   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:25:01.060] error: min-available and max-unavailable cannot be both specified
I0315 08:25:01.160] core.sh:264: Successful describe pods --namespace=test-kubectl-describe-pod env-test-pod:
I0315 08:25:01.161] Name:               env-test-pod
I0315 08:25:01.161] Namespace:          test-kubectl-describe-pod
I0315 08:25:01.161] Priority:           0
I0315 08:25:01.161] PriorityClassName:  <none>
I0315 08:25:01.161] Node:               <none>
... skipping 169 lines ...
W0315 08:25:13.254] I0315 08:25:12.494312   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:25:13.254] I0315 08:25:12.795604   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552638307-20575", Name:"modified", UID:"da7cf338-46fb-11e9-af77-0242ac110002", APIVersion:"v1", ResourceVersion:"379", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: modified-g6h7l
I0315 08:25:13.413] core.sh:434: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 08:25:13.570] pod/valid-pod created
I0315 08:25:13.667] core.sh:438: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0315 08:25:13.832] Successful
I0315 08:25:13.832] message:Error from server: cannot restore map from string
I0315 08:25:13.832] has:cannot restore map from string
I0315 08:25:13.926] Successful
I0315 08:25:13.927] message:pod/valid-pod patched (no change)
I0315 08:25:13.927] has:patched (no change)
I0315 08:25:14.012] pod/valid-pod patched
I0315 08:25:14.106] core.sh:455: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
... skipping 5 lines ...
I0315 08:25:14.626] pod/valid-pod patched
I0315 08:25:14.723] core.sh:470: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: changed-with-yaml:
I0315 08:25:14.803] pod/valid-pod patched
I0315 08:25:14.903] core.sh:475: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:3.1:
I0315 08:25:15.078] pod/valid-pod patched
I0315 08:25:15.181] core.sh:491: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
I0315 08:25:15.361] +++ [0315 08:25:15] "kubectl patch with resourceVersion 499" returns error as expected: Error from server (Conflict): Operation cannot be fulfilled on pods "valid-pod": the object has been modified; please apply your changes to the latest version and try again
W0315 08:25:15.461] I0315 08:25:13.494525   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:25:15.462] I0315 08:25:13.494794   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:25:15.462] E0315 08:25:13.823742   55692 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"cannot restore map from string"}
W0315 08:25:15.462] I0315 08:25:14.495045   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:25:15.463] I0315 08:25:14.495276   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:25:15.496] I0315 08:25:15.495576   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:25:15.496] I0315 08:25:15.495852   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 08:25:15.612] pod "valid-pod" deleted
I0315 08:25:15.626] pod/valid-pod replaced
I0315 08:25:15.742] core.sh:515: Successful get pod valid-pod {{(index .spec.containers 0).name}}: replaced-k8s-serve-hostname
I0315 08:25:15.918] Successful
I0315 08:25:15.919] message:error: --grace-period must have --force specified
I0315 08:25:15.919] has:\-\-grace-period must have \-\-force specified
I0315 08:25:16.088] Successful
I0315 08:25:16.089] message:error: --timeout must have --force specified
I0315 08:25:16.089] has:\-\-timeout must have \-\-force specified
I0315 08:25:16.249] node/node-v1-test created
W0315 08:25:16.350] W0315 08:25:16.249393   58979 actual_state_of_world.go:503] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="node-v1-test" does not exist
I0315 08:25:16.451] node/node-v1-test replaced
I0315 08:25:16.514] core.sh:552: Successful get node node-v1-test {{.metadata.annotations.a}}: b
I0315 08:25:16.592] node "node-v1-test" deleted
I0315 08:25:16.689] core.sh:559: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
I0315 08:25:16.986] core.sh:562: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/serve_hostname:
I0315 08:25:18.024] core.sh:575: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
... skipping 30 lines ...
W0315 08:25:19.469] Edit cancelled, no changes made.
W0315 08:25:19.470] I0315 08:25:17.496807   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:25:19.470] I0315 08:25:17.497071   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:25:19.470] Edit cancelled, no changes made.
W0315 08:25:19.470] Edit cancelled, no changes made.
W0315 08:25:19.470] Edit cancelled, no changes made.
W0315 08:25:19.470] error: 'name' already has a value (valid-pod), and --overwrite is false
W0315 08:25:19.471] I0315 08:25:18.497352   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:25:19.471] I0315 08:25:18.497619   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:25:19.471] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0315 08:25:19.498] I0315 08:25:19.498002   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:25:19.498] I0315 08:25:19.498233   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 08:25:19.599] core.sh:614: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: redis-master:valid-pod:
... skipping 86 lines ...
I0315 08:25:25.939] +++ Running case: test-cmd.run_kubectl_create_error_tests 
I0315 08:25:25.941] +++ working dir: /go/src/k8s.io/kubernetes
I0315 08:25:25.943] +++ command: run_kubectl_create_error_tests
I0315 08:25:25.954] +++ [0315 08:25:25] Creating namespace namespace-1552638325-12741
I0315 08:25:26.026] namespace/namespace-1552638325-12741 created
I0315 08:25:26.097] Context "test" modified.
I0315 08:25:26.103] +++ [0315 08:25:26] Testing kubectl create with error
W0315 08:25:26.203] I0315 08:25:25.501294   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:25:26.204] I0315 08:25:25.501479   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:25:26.204] Error: must specify one of -f and -k
W0315 08:25:26.204] 
W0315 08:25:26.205] Create a resource from a file or from stdin.
W0315 08:25:26.205] 
W0315 08:25:26.205]  JSON and YAML formats are accepted.
W0315 08:25:26.205] 
W0315 08:25:26.205] Examples:
... skipping 41 lines ...
W0315 08:25:26.214] 
W0315 08:25:26.214] Usage:
W0315 08:25:26.214]   kubectl create -f FILENAME [options]
W0315 08:25:26.214] 
W0315 08:25:26.214] Use "kubectl <command> --help" for more information about a given command.
W0315 08:25:26.215] Use "kubectl options" for a list of global command-line options (applies to all commands).
I0315 08:25:26.337] +++ [0315 08:25:26] "kubectl create with empty string list returns error as expected: error: error validating "hack/testdata/invalid-rc-with-empty-args.yaml": error validating data: ValidationError(ReplicationController.spec.template.spec.containers[0].args): unknown object type "nil" in ReplicationController.spec.template.spec.containers[0].args[0]; if you choose to ignore these errors, turn validation off with --validate=false
W0315 08:25:26.438] kubectl convert is DEPRECATED and will be removed in a future version.
W0315 08:25:26.438] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
W0315 08:25:26.502] I0315 08:25:26.501753   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:25:26.502] I0315 08:25:26.501994   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 08:25:26.603] +++ exit code: 0
I0315 08:25:26.604] Recording: run_kubectl_apply_tests
... skipping 25 lines ...
W0315 08:25:28.631] I0315 08:25:28.631081   55692 clientconn.go:551] parsed scheme: ""
W0315 08:25:28.632] I0315 08:25:28.631114   55692 clientconn.go:557] scheme "" not registered, fallback to default scheme
W0315 08:25:28.632] I0315 08:25:28.631156   55692 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W0315 08:25:28.632] I0315 08:25:28.631237   55692 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 08:25:28.632] I0315 08:25:28.631984   55692 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 08:25:28.635] I0315 08:25:28.634673   55692 controller.go:606] quota admission added evaluator for: resources.mygroup.example.com
W0315 08:25:28.728] Error from server (NotFound): resources.mygroup.example.com "myobj" not found
I0315 08:25:28.829] kind.mygroup.example.com/myobj serverside-applied (server dry run)
I0315 08:25:28.829] customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
I0315 08:25:28.849] +++ exit code: 0
I0315 08:25:28.883] Recording: run_kubectl_run_tests
I0315 08:25:28.883] Running command: run_kubectl_run_tests
I0315 08:25:28.902] 
... skipping 84 lines ...
I0315 08:25:31.316] Context "test" modified.
I0315 08:25:31.322] +++ [0315 08:25:31] Testing kubectl create filter
I0315 08:25:31.413] create.sh:30: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 08:25:31.573] pod/selector-test-pod created
I0315 08:25:31.667] create.sh:34: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
I0315 08:25:31.760] Successful
I0315 08:25:31.761] message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
I0315 08:25:31.761] has:pods "selector-test-pod-dont-apply" not found
I0315 08:25:31.842] pod "selector-test-pod" deleted
I0315 08:25:31.862] +++ exit code: 0
I0315 08:25:31.895] Recording: run_kubectl_apply_deployments_tests
I0315 08:25:31.895] Running command: run_kubectl_apply_deployments_tests
I0315 08:25:31.915] 
... skipping 55 lines ...
W0315 08:25:34.334] I0315 08:25:32.538578   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638331-878", Name:"my-depl-656cffcbcc", UID:"e64175f9-46fb-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"555", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: my-depl-656cffcbcc-7cpx8
W0315 08:25:34.334] I0315 08:25:33.075600   58979 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552638331-878", Name:"my-depl", UID:"e6404b89-46fb-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"563", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set my-depl-64775887d7 to 1
W0315 08:25:34.335] I0315 08:25:33.078946   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638331-878", Name:"my-depl-64775887d7", UID:"e6944410-46fb-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"565", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: my-depl-64775887d7-b9trc
W0315 08:25:34.335] I0315 08:25:33.505191   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:25:34.335] I0315 08:25:33.505371   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:25:34.336] I0315 08:25:33.681310   55692 controller.go:606] quota admission added evaluator for: replicasets.extensions
W0315 08:25:34.336] E0315 08:25:33.707180   58979 replica_set.go:450] Sync "namespace-1552638331-878/my-depl-64775887d7" failed with replicasets.apps "my-depl-64775887d7" not found
W0315 08:25:34.336] I0315 08:25:34.231917   58979 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552638331-878", Name:"nginx", UID:"e743e96d-46fb-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"594", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-776cc67f78 to 3
W0315 08:25:34.337] I0315 08:25:34.236664   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638331-878", Name:"nginx-776cc67f78", UID:"e744b3e5-46fb-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"595", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-776cc67f78-vfvm2
W0315 08:25:34.337] I0315 08:25:34.241369   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638331-878", Name:"nginx-776cc67f78", UID:"e744b3e5-46fb-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"595", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-776cc67f78-vmj2s
W0315 08:25:34.337] I0315 08:25:34.241411   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638331-878", Name:"nginx-776cc67f78", UID:"e744b3e5-46fb-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"595", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-776cc67f78-8hjvr
I0315 08:25:34.438] apps.sh:147: Successful get deployment nginx {{.metadata.name}}: nginx
I0315 08:25:38.585] Successful
I0315 08:25:38.586] message:Error from server (Conflict): error when applying patch:
I0315 08:25:38.587] {"metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"extensions/v1beta1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1552638331-878\",\"resourceVersion\":\"99\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx2\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx2\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"},"resourceVersion":"99"},"spec":{"selector":{"matchLabels":{"name":"nginx2"}},"template":{"metadata":{"labels":{"name":"nginx2"}}}}}
I0315 08:25:38.587] to:
I0315 08:25:38.587] Resource: "extensions/v1beta1, Resource=deployments", GroupVersionKind: "extensions/v1beta1, Kind=Deployment"
I0315 08:25:38.587] Name: "nginx", Namespace: "namespace-1552638331-878"
I0315 08:25:38.590] Object: &{map["status":map["observedGeneration":'\x01' "replicas":'\x03' "updatedReplicas":'\x03' "unavailableReplicas":'\x03' "conditions":[map["reason":"MinimumReplicasUnavailable" "message":"Deployment does not have minimum availability." "type":"Available" "status":"False" "lastUpdateTime":"2019-03-15T08:25:34Z" "lastTransitionTime":"2019-03-15T08:25:34Z"]]] "kind":"Deployment" "apiVersion":"extensions/v1beta1" "metadata":map["generation":'\x01' "creationTimestamp":"2019-03-15T08:25:34Z" "managedFields":[map["manager":"kube-controller-manager" "operation":"Update" "apiVersion":"apps/v1" "time":"2019-03-15T08:25:34Z" "fields":map["f:metadata":map["f:annotations":map["f:deployment.kubernetes.io/revision":map[]]] "f:status":map["f:updatedReplicas":map[] "f:conditions":map[".":map[] "k:{\"type\":\"Available\"}":map["f:lastTransitionTime":map[] "f:lastUpdateTime":map[] "f:message":map[] "f:reason":map[] "f:status":map[] "f:type":map[] ".":map[]]] "f:observedGeneration":map[] "f:replicas":map[] "f:unavailableReplicas":map[]]]] map["time":"2019-03-15T08:25:34Z" "fields":map["f:metadata":map["f:annotations":map[".":map[] "f:kubectl.kubernetes.io/last-applied-configuration":map[]] "f:labels":map[".":map[] "f:name":map[]]] "f:spec":map["f:progressDeadlineSeconds":map[] "f:replicas":map[] "f:revisionHistoryLimit":map[] "f:selector":map[".":map[] "f:matchLabels":map[".":map[] "f:name":map[]]] "f:strategy":map["f:type":map[] "f:rollingUpdate":map["f:maxUnavailable":map[] ".":map[] "f:maxSurge":map[]]] "f:template":map["f:metadata":map["f:labels":map[".":map[] "f:name":map[]]] "f:spec":map["f:containers":map["k:{\"name\":\"nginx\"}":map["f:name":map[] "f:ports":map["k:{\"containerPort\":80,\"protocol\":\"TCP\"}":map["f:containerPort":map[] "f:protocol":map[] ".":map[]] ".":map[]] "f:resources":map[] "f:terminationMessagePath":map[] "f:terminationMessagePolicy":map[] ".":map[] "f:image":map[] "f:imagePullPolicy":map[]]] "f:dnsPolicy":map[] "f:restartPolicy":map[] "f:schedulerName":map[] "f:securityContext":map[] "f:terminationGracePeriodSeconds":map[]]]]] "manager":"kubectl" "operation":"Update" "apiVersion":"extensions/v1beta1"]] "namespace":"namespace-1552638331-878" "selfLink":"/apis/extensions/v1beta1/namespaces/namespace-1552638331-878/deployments/nginx" "resourceVersion":"607" "annotations":map["deployment.kubernetes.io/revision":"1" "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"extensions/v1beta1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1552638331-878\"},\"spec\":{\"replicas\":3,\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx1\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"] "name":"nginx" "uid":"e743e96d-46fb-11e9-af77-0242ac110002" "labels":map["name":"nginx"]] "spec":map["replicas":'\x03' "selector":map["matchLabels":map["name":"nginx1"]] "template":map["metadata":map["creationTimestamp":<nil> "labels":map["name":"nginx1"]] "spec":map["dnsPolicy":"ClusterFirst" "securityContext":map[] "schedulerName":"default-scheduler" "containers":[map["image":"k8s.gcr.io/nginx:test-cmd" "ports":[map["protocol":"TCP" "containerPort":'P']] "resources":map[] "terminationMessagePath":"/dev/termination-log" "terminationMessagePolicy":"File" "imagePullPolicy":"IfNotPresent" "name":"nginx"]] "restartPolicy":"Always" "terminationGracePeriodSeconds":'\x1e']] 
"strategy":map["type":"RollingUpdate" "rollingUpdate":map["maxUnavailable":'\x01' "maxSurge":'\x01']] "revisionHistoryLimit":%!q(int64=+2147483647) "progressDeadlineSeconds":%!q(int64=+2147483647)]]}
I0315 08:25:38.591] for: "hack/testdata/deployment-label-change2.yaml": Operation cannot be fulfilled on deployments.extensions "nginx": the object has been modified; please apply your changes to the latest version and try again
I0315 08:25:38.591] has:Error from server (Conflict)
W0315 08:25:38.692] I0315 08:25:34.505577   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:25:38.692] I0315 08:25:34.505836   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:25:38.692] I0315 08:25:35.506010   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:25:38.692] I0315 08:25:35.506208   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:25:38.693] I0315 08:25:36.506533   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:25:38.693] I0315 08:25:36.506826   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
... skipping 26 lines ...
W0315 08:25:45.512] I0315 08:25:45.512038   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:25:45.513] I0315 08:25:45.512298   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:25:46.513] I0315 08:25:46.512600   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:25:46.513] I0315 08:25:46.512916   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:25:47.514] I0315 08:25:47.513198   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:25:47.514] I0315 08:25:47.513416   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:25:48.224] E0315 08:25:48.223436   58979 replica_set.go:450] Sync "namespace-1552638331-878/nginx-7bd4fbc645" failed with Operation cannot be fulfilled on replicasets.apps "nginx-7bd4fbc645": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1552638331-878/nginx-7bd4fbc645, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: ecffaef7-46fb-11e9-af77-0242ac110002, UID in object meta: 
W0315 08:25:48.514] I0315 08:25:48.513690   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:25:48.514] I0315 08:25:48.513920   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:25:49.209] I0315 08:25:49.208833   58979 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552638331-878", Name:"nginx", UID:"f030ee09-46fb-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"663", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-7bd4fbc645 to 3
W0315 08:25:49.215] I0315 08:25:49.214700   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638331-878", Name:"nginx-7bd4fbc645", UID:"f031eceb-46fb-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"664", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-7bd4fbc645-nzcf5
W0315 08:25:49.218] I0315 08:25:49.218058   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638331-878", Name:"nginx-7bd4fbc645", UID:"f031eceb-46fb-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"664", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-7bd4fbc645-pz75t
W0315 08:25:49.220] I0315 08:25:49.219471   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638331-878", Name:"nginx-7bd4fbc645", UID:"f031eceb-46fb-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"664", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-7bd4fbc645-xsmj9
... skipping 165 lines ...
I0315 08:25:51.167] +++ [0315 08:25:51] Creating namespace namespace-1552638351-21305
I0315 08:25:51.241] namespace/namespace-1552638351-21305 created
I0315 08:25:51.313] Context "test" modified.
I0315 08:25:51.319] +++ [0315 08:25:51] Testing kubectl get
I0315 08:25:51.407] get.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 08:25:51.489] Successful
I0315 08:25:51.490] message:Error from server (NotFound): pods "abc" not found
I0315 08:25:51.490] has:pods "abc" not found
I0315 08:25:51.576] get.sh:37: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 08:25:51.659] Successful
I0315 08:25:51.659] message:Error from server (NotFound): pods "abc" not found
I0315 08:25:51.659] has:pods "abc" not found
I0315 08:25:51.748] get.sh:45: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 08:25:51.831] Successful
I0315 08:25:51.831] message:{
I0315 08:25:51.831]     "apiVersion": "v1",
I0315 08:25:51.831]     "items": [],
... skipping 23 lines ...
I0315 08:25:52.172] has not:No resources found
I0315 08:25:52.252] Successful
I0315 08:25:52.252] message:NAME
I0315 08:25:52.252] has not:No resources found
I0315 08:25:52.342] get.sh:73: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 08:25:52.438] Successful
I0315 08:25:52.438] message:error: the server doesn't have a resource type "foobar"
I0315 08:25:52.439] has not:No resources found
I0315 08:25:52.520] Successful
I0315 08:25:52.520] message:No resources found.
I0315 08:25:52.520] has:No resources found
I0315 08:25:52.602] Successful
I0315 08:25:52.603] message:
I0315 08:25:52.603] has not:No resources found
I0315 08:25:52.685] Successful
I0315 08:25:52.685] message:No resources found.
I0315 08:25:52.685] has:No resources found
I0315 08:25:52.773] get.sh:93: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 08:25:52.860] Successful
I0315 08:25:52.861] message:Error from server (NotFound): pods "abc" not found
I0315 08:25:52.861] has:pods "abc" not found
I0315 08:25:52.863] FAIL!
I0315 08:25:52.863] message:Error from server (NotFound): pods "abc" not found
I0315 08:25:52.863] has not:List
I0315 08:25:52.863] 99 /go/src/k8s.io/kubernetes/test/cmd/../../test/cmd/get.sh
W0315 08:25:52.964] I0315 08:25:51.515362   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:25:52.964] I0315 08:25:51.515670   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:25:52.964] I0315 08:25:52.516009   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:25:52.965] I0315 08:25:52.516211   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
... skipping 717 lines ...
I0315 08:25:56.452] }
I0315 08:25:56.531] get.sh:155: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0315 08:25:56.769] <no value>Successful
I0315 08:25:56.770] message:valid-pod:
I0315 08:25:56.770] has:valid-pod:
I0315 08:25:56.852] Successful
I0315 08:25:56.853] message:error: error executing jsonpath "{.missing}": Error executing template: missing is not found. Printing more information for debugging the template:
I0315 08:25:56.853] 	template was:
I0315 08:25:56.853] 		{.missing}
I0315 08:25:56.853] 	object given to jsonpath engine was:
I0315 08:25:56.854] 		map[string]interface {}{"kind":"Pod", "apiVersion":"v1", "metadata":map[string]interface {}{"namespace":"namespace-1552638355-30915", "selfLink":"/api/v1/namespaces/namespace-1552638355-30915/pods/valid-pod", "uid":"f4744cbe-46fb-11e9-af77-0242ac110002", "resourceVersion":"703", "creationTimestamp":"2019-03-15T08:25:56Z", "labels":map[string]interface {}{"name":"valid-pod"}, "managedFields":[]interface {}{map[string]interface {}{"manager":"kubectl", "operation":"Update", "apiVersion":"v1", "time":"2019-03-15T08:25:56Z", "fields":map[string]interface {}{"f:metadata":map[string]interface {}{"f:labels":map[string]interface {}{".":map[string]interface {}{}, "f:name":map[string]interface {}{}}}, "f:spec":map[string]interface {}{"f:terminationGracePeriodSeconds":map[string]interface {}{}, "f:containers":map[string]interface {}{"k:{\"name\":\"kubernetes-serve-hostname\"}":map[string]interface {}{".":map[string]interface {}{}, "f:image":map[string]interface {}{}, "f:imagePullPolicy":map[string]interface {}{}, "f:name":map[string]interface {}{}, "f:resources":map[string]interface {}{"f:requests":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}, ".":map[string]interface {}{}, "f:limits":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}}, "f:terminationMessagePath":map[string]interface {}{}, "f:terminationMessagePolicy":map[string]interface {}{}}}, "f:dnsPolicy":map[string]interface {}{}, "f:enableServiceLinks":map[string]interface {}{}, "f:priority":map[string]interface {}{}, "f:restartPolicy":map[string]interface {}{}, "f:schedulerName":map[string]interface {}{}, "f:securityContext":map[string]interface {}{}}}}}, "name":"valid-pod"}, "spec":map[string]interface {}{"containers":[]interface {}{map[string]interface {}{"imagePullPolicy":"Always", "name":"kubernetes-serve-hostname", "image":"k8s.gcr.io/serve_hostname", "resources":map[string]interface {}{"limits":map[string]interface {}{"memory":"512Mi", "cpu":"1"}, "requests":map[string]interface {}{"cpu":"1", "memory":"512Mi"}}, "terminationMessagePath":"/dev/termination-log", "terminationMessagePolicy":"File"}}, "restartPolicy":"Always", "terminationGracePeriodSeconds":30, "dnsPolicy":"ClusterFirst", "securityContext":map[string]interface {}{}, "schedulerName":"default-scheduler", "priority":0, "enableServiceLinks":true}, "status":map[string]interface {}{"phase":"Pending", "qosClass":"Guaranteed"}}
I0315 08:25:56.855] has:missing is not found
I0315 08:25:56.940] Successful
I0315 08:25:56.941] message:Error executing template: template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing". Printing more information for debugging the template:
I0315 08:25:56.941] 	template was:
I0315 08:25:56.941] 		{{.missing}}
I0315 08:25:56.941] 	raw data was:
I0315 08:25:56.942] 		{"apiVersion":"v1","kind":"Pod","metadata":{"creationTimestamp":"2019-03-15T08:25:56Z","labels":{"name":"valid-pod"},"managedFields":[{"apiVersion":"v1","fields":{"f:metadata":{"f:labels":{".":{},"f:name":{}}},"f:spec":{"f:containers":{"k:{\"name\":\"kubernetes-serve-hostname\"}":{".":{},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:resources":{".":{},"f:limits":{".":{},"f:cpu":{},"f:memory":{}},"f:requests":{".":{},"f:cpu":{},"f:memory":{}}},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:enableServiceLinks":{},"f:priority":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}},"manager":"kubectl","operation":"Update","time":"2019-03-15T08:25:56Z"}],"name":"valid-pod","namespace":"namespace-1552638355-30915","resourceVersion":"703","selfLink":"/api/v1/namespaces/namespace-1552638355-30915/pods/valid-pod","uid":"f4744cbe-46fb-11e9-af77-0242ac110002"},"spec":{"containers":[{"image":"k8s.gcr.io/serve_hostname","imagePullPolicy":"Always","name":"kubernetes-serve-hostname","resources":{"limits":{"cpu":"1","memory":"512Mi"},"requests":{"cpu":"1","memory":"512Mi"}},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File"}],"dnsPolicy":"ClusterFirst","enableServiceLinks":true,"priority":0,"restartPolicy":"Always","schedulerName":"default-scheduler","securityContext":{},"terminationGracePeriodSeconds":30},"status":{"phase":"Pending","qosClass":"Guaranteed"}}
I0315 08:25:56.942] 	object given to template engine was:
I0315 08:25:56.943] 		map[spec:map[enableServiceLinks:true priority:0 restartPolicy:Always schedulerName:default-scheduler securityContext:map[] terminationGracePeriodSeconds:30 containers:[map[image:k8s.gcr.io/serve_hostname imagePullPolicy:Always name:kubernetes-serve-hostname resources:map[limits:map[cpu:1 memory:512Mi] requests:map[cpu:1 memory:512Mi]] terminationMessagePath:/dev/termination-log terminationMessagePolicy:File]] dnsPolicy:ClusterFirst] status:map[phase:Pending qosClass:Guaranteed] apiVersion:v1 kind:Pod metadata:map[namespace:namespace-1552638355-30915 resourceVersion:703 selfLink:/api/v1/namespaces/namespace-1552638355-30915/pods/valid-pod uid:f4744cbe-46fb-11e9-af77-0242ac110002 creationTimestamp:2019-03-15T08:25:56Z labels:map[name:valid-pod] managedFields:[map[apiVersion:v1 fields:map[f:metadata:map[f:labels:map[.:map[] f:name:map[]]] f:spec:map[f:enableServiceLinks:map[] f:priority:map[] f:restartPolicy:map[] f:schedulerName:map[] f:securityContext:map[] f:terminationGracePeriodSeconds:map[] f:containers:map[k:{"name":"kubernetes-serve-hostname"}:map[f:image:map[] f:imagePullPolicy:map[] f:name:map[] f:resources:map[f:limits:map[.:map[] f:cpu:map[] f:memory:map[]] f:requests:map[.:map[] f:cpu:map[] f:memory:map[]] .:map[]] f:terminationMessagePath:map[] f:terminationMessagePolicy:map[] .:map[]]] f:dnsPolicy:map[]]] manager:kubectl operation:Update time:2019-03-15T08:25:56Z]] name:valid-pod]]
I0315 08:25:56.943] has:map has no entry for key "missing"
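The block above is kubectl's Go-template error path: when a template names a key the object doesn't have, kubectl prints the template, the raw JSON, and the decoded map before failing. A minimal sketch of the kind of invocation being asserted (the harness's exact flags are an assumption; -o go-template= is a real kubectl output flag):

  kubectl get pod valid-pod -o go-template='{{.missing}}'
  # error: error executing template "{{.missing}}": template: output:1:2:
  #   executing "output" at <.missing>: map has no entry for key "missing"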
W0315 08:25:57.044] I0315 08:25:56.518192   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:25:57.044] I0315 08:25:56.518394   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:25:57.044] error: error executing template "{{.missing}}": template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing"
W0315 08:25:57.519] I0315 08:25:57.518633   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:25:57.519] I0315 08:25:57.518883   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:25:58.023] E0315 08:25:58.022432   70464 streamwatcher.go:109] Unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)
I0315 08:25:58.124] Successful
I0315 08:25:58.124] message:NAME        READY   STATUS    RESTARTS   AGE
I0315 08:25:58.124] valid-pod   0/1     Pending   0          1s
... skipping 158 lines ...
I0315 08:26:00.316]   terminationGracePeriodSeconds: 30
I0315 08:26:00.316] status:
I0315 08:26:00.316]   phase: Pending
I0315 08:26:00.316]   qosClass: Guaranteed
I0315 08:26:00.316] has:name: valid-pod
I0315 08:26:00.316] Successful
I0315 08:26:00.317] message:Error from server (NotFound): pods "invalid-pod" not found
I0315 08:26:00.317] has:"invalid-pod" not found
I0315 08:26:00.372] pod "valid-pod" deleted
I0315 08:26:00.466] get.sh:193: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 08:26:00.630] pod/redis-master created
I0315 08:26:00.634] pod/valid-pod created
I0315 08:26:00.729] Successful
... skipping 282 lines ...
I0315 08:26:05.942] Running command: run_create_secret_tests
I0315 08:26:05.961] 
I0315 08:26:05.963] +++ Running case: test-cmd.run_create_secret_tests 
I0315 08:26:05.965] +++ working dir: /go/src/k8s.io/kubernetes
I0315 08:26:05.967] +++ command: run_create_secret_tests
I0315 08:26:06.056] Successful
I0315 08:26:06.056] message:Error from server (NotFound): secrets "mysecret" not found
I0315 08:26:06.056] has:secrets "mysecret" not found
W0315 08:26:06.157] I0315 08:26:05.087981   55692 clientconn.go:551] parsed scheme: ""
W0315 08:26:06.157] I0315 08:26:05.088026   55692 clientconn.go:557] scheme "" not registered, fallback to default scheme
W0315 08:26:06.158] I0315 08:26:05.088084   55692 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W0315 08:26:06.158] I0315 08:26:05.088188   55692 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 08:26:06.158] I0315 08:26:05.088735   55692 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 08:26:06.158] No resources found.
W0315 08:26:06.159] No resources found.
W0315 08:26:06.159] I0315 08:26:05.523016   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:26:06.159] I0315 08:26:05.523158   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 08:26:06.260] Successful
I0315 08:26:06.260] message:Error from server (NotFound): secrets "mysecret" not found
I0315 08:26:06.260] has:secrets "mysecret" not found
I0315 08:26:06.260] Successful
I0315 08:26:06.261] message:user-specified
I0315 08:26:06.261] has:user-specified
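The secret sequence above only shows the assertions, not the harness commands themselves; a hedged sketch of commands consistent with it (the literal key name and dry-run form are hypothetical):

  kubectl get secret mysecret    # Error from server (NotFound): secrets "mysecret" not found
  kubectl create secret generic mysecret --from-literal=key=user-specified --dry-run -o yaml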
I0315 08:26:06.286] Successful
I0315 08:26:06.358] {"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-create-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-create-cm","uid":"fa6ab188-46fb-11e9-af77-0242ac110002","resourceVersion":"810","creationTimestamp":"2019-03-15T08:26:06Z"}}
... skipping 178 lines ...
I0315 08:26:10.201] has:Timeout exceeded while reading body
I0315 08:26:10.279] Successful
I0315 08:26:10.279] message:NAME        READY   STATUS    RESTARTS   AGE
I0315 08:26:10.279] valid-pod   0/1     Pending   0          2s
I0315 08:26:10.279] has:valid-pod
I0315 08:26:10.349] Successful
I0315 08:26:10.349] message:error: Invalid timeout value. Timeout must be a single integer in seconds, or an integer followed by a corresponding time unit (e.g. 1s | 2m | 3h)
I0315 08:26:10.350] has:Invalid timeout value
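Both timeout assertions appear to exercise kubectl's client-side --request-timeout flag: a short deadline on a watch trips "Timeout exceeded while reading body", and a malformed value is rejected with the validation message above. A sketch (the flag attribution is inferred from the error text):

  kubectl get pods --request-timeout=1s       # accepted: integer plus time unit
  kubectl get pods --request-timeout=invalid  # error: Invalid timeout value. ...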
I0315 08:26:10.427] pod "valid-pod" deleted
I0315 08:26:10.446] +++ exit code: 0
I0315 08:26:10.477] Recording: run_crd_tests
I0315 08:26:10.478] Running command: run_crd_tests
I0315 08:26:10.496] 
... skipping 223 lines ...
I0315 08:26:14.768] foo.company.com/test patched
I0315 08:26:14.863] crd.sh:237: Successful get foos/test {{.patched}}: value1
I0315 08:26:14.950] foo.company.com/test patched
I0315 08:26:15.041] crd.sh:239: Successful get foos/test {{.patched}}: value2
I0315 08:26:15.127] foo.company.com/test patched
I0315 08:26:15.220] crd.sh:241: Successful get foos/test {{.patched}}: <no value>
I0315 08:26:15.373] +++ [0315 08:26:15] "kubectl patch --local" returns error as expected for CustomResource: error: cannot apply strategic merge patch for company.com/v1, Kind=Foo locally, try --type merge
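The recorded change-cause a few lines below confirms the working form: custom resources carry no strategic-merge patch metadata, so kubectl patch must be told --type merge (or json). A sketch matching the three patch assertions above:

  kubectl patch foos/test --type merge --patch '{"patched":"value1"}'
  kubectl patch foos/test --type merge --patch '{"patched":"value2"}'
  kubectl patch foos/test --type merge --patch '{"patched":null}'   # clears the field, hence <no value>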
I0315 08:26:15.435] {
I0315 08:26:15.435]     "apiVersion": "company.com/v1",
I0315 08:26:15.435]     "kind": "Foo",
I0315 08:26:15.436]     "metadata": {
I0315 08:26:15.436]         "annotations": {
I0315 08:26:15.436]             "kubernetes.io/change-cause": "kubectl patch foos/test --server=http://127.0.0.1:8080 --match-server-version=true --patch={\"patched\":null} --type=merge --record=true"
... skipping 309 lines ...
W0315 08:26:36.541] I0315 08:26:36.540495   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:26:36.541] I0315 08:26:36.540794   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:26:37.541] I0315 08:26:37.541106   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:26:37.542] I0315 08:26:37.541363   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:26:38.542] I0315 08:26:38.541641   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:26:38.542] I0315 08:26:38.541912   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:26:38.584] E0315 08:26:38.583675   58979 resource_quota_controller.go:437] failed to sync resource monitors: [couldn't start monitor for resource "mygroup.example.com/v1alpha1, Resource=resources": unable to monitor quota for resource "mygroup.example.com/v1alpha1, Resource=resources", couldn't start monitor for resource "company.com/v1, Resource=bars": unable to monitor quota for resource "company.com/v1, Resource=bars", couldn't start monitor for resource "company.com/v1, Resource=foos": unable to monitor quota for resource "company.com/v1, Resource=foos", couldn't start monitor for resource "extensions/v1beta1, Resource=networkpolicies": unable to monitor quota for resource "extensions/v1beta1, Resource=networkpolicies", couldn't start monitor for resource "company.com/v1, Resource=validfoos": unable to monitor quota for resource "company.com/v1, Resource=validfoos"]
W0315 08:26:39.196] I0315 08:26:39.196402   58979 controller_utils.go:1027] Waiting for caches to sync for garbage collector controller
W0315 08:26:39.198] I0315 08:26:39.197820   55692 clientconn.go:551] parsed scheme: ""
W0315 08:26:39.198] I0315 08:26:39.197846   55692 clientconn.go:557] scheme "" not registered, fallback to default scheme
W0315 08:26:39.198] I0315 08:26:39.197913   55692 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W0315 08:26:39.199] I0315 08:26:39.198006   55692 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 08:26:39.199] I0315 08:26:39.198473   55692 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
... skipping 105 lines ...
I0315 08:26:51.478] +++ [0315 08:26:51] Testing cmd with image
I0315 08:26:51.572] Successful
I0315 08:26:51.573] message:deployment.apps/test1 created
I0315 08:26:51.573] has:deployment.apps/test1 created
I0315 08:26:51.653] deployment.extensions "test1" deleted
I0315 08:26:51.729] Successful
I0315 08:26:51.730] message:error: Invalid image name "InvalidImageName": invalid reference format
I0315 08:26:51.730] has:error: Invalid image name "InvalidImageName": invalid reference format
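The deprecation warning later in the log shows these cases run through kubectl run: a parseable image name yields a deployment, while one that fails image-reference parsing is rejected up front. A sketch (the valid image name here is hypothetical):

  kubectl run test1 --image=k8s.gcr.io/serve_hostname   # deployment.apps/test1 created
  kubectl run test2 --image=InvalidImageName            # error: Invalid image name ...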
I0315 08:26:51.743] +++ exit code: 0
I0315 08:26:51.780] +++ [0315 08:26:51] Testing recursive resources
I0315 08:26:51.785] +++ [0315 08:26:51] Creating namespace namespace-1552638411-8187
I0315 08:26:51.857] namespace/namespace-1552638411-8187 created
I0315 08:26:51.928] Context "test" modified.
I0315 08:26:52.015] generic-resources.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 08:26:52.277] generic-resources.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 08:26:52.279] Successful
I0315 08:26:52.280] message:pod/busybox0 created
I0315 08:26:52.280] pod/busybox1 created
I0315 08:26:52.280] error: error validating "hack/testdata/recursive/pod/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0315 08:26:52.280] has:error validating data: kind not set
I0315 08:26:52.366] generic-resources.sh:211: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 08:26:52.534] generic-resources.sh:219: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: busybox:busybox:
I0315 08:26:52.536] Successful
I0315 08:26:52.536] message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 08:26:52.536] has:Object 'Kind' is missing
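Every "Object 'Kind' is missing" and "kind not set" failure in this section is intentional: the recursive fixtures include one broken manifest (busybox-broken.yaml, with kind misspelled as ind), so each recursive command must handle the two good pods and surface exactly one per-file error. A sketch of the pattern, assuming kubectl's --recursive (-R) file flag:

  kubectl create -f hack/testdata/recursive/pod --recursive
  # pod/busybox0 created
  # pod/busybox1 created
  # error: error validating ".../busybox-broken.yaml": ... kind not set ...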
I0315 08:26:52.625] generic-resources.sh:226: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 08:26:52.894] generic-resources.sh:230: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
I0315 08:26:52.896] Successful
I0315 08:26:52.896] message:pod/busybox0 replaced
I0315 08:26:52.896] pod/busybox1 replaced
I0315 08:26:52.897] error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0315 08:26:52.897] has:error validating data: kind not set
I0315 08:26:52.986] generic-resources.sh:235: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 08:26:53.083] Successful
I0315 08:26:53.083] message:Name:               busybox0
I0315 08:26:53.083] Namespace:          namespace-1552638411-8187
I0315 08:26:53.083] Priority:           0
I0315 08:26:53.083] PriorityClassName:  <none>
... skipping 159 lines ...
I0315 08:26:53.112] has:Object 'Kind' is missing
I0315 08:26:53.183] generic-resources.sh:245: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 08:26:53.370] generic-resources.sh:249: Successful get pods {{range.items}}{{.metadata.annotations.annotatekey}}:{{end}}: annotatevalue:annotatevalue:
I0315 08:26:53.372] Successful
I0315 08:26:53.372] message:pod/busybox0 annotated
I0315 08:26:53.372] pod/busybox1 annotated
I0315 08:26:53.372] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 08:26:53.373] has:Object 'Kind' is missing
I0315 08:26:53.464] generic-resources.sh:254: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 08:26:53.738] generic-resources.sh:258: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
I0315 08:26:53.740] Successful
I0315 08:26:53.740] message:Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
I0315 08:26:53.740] pod/busybox0 configured
I0315 08:26:53.741] Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
I0315 08:26:53.741] pod/busybox1 configured
I0315 08:26:53.741] error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0315 08:26:53.741] has:error validating data: kind not set
I0315 08:26:53.832] generic-resources.sh:264: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 08:26:54.011] deployment.apps/nginx created
W0315 08:26:54.112] Error from server (NotFound): namespaces "non-native-resources" not found
W0315 08:26:54.112] kubectl run --generator=deployment/apps.v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
W0315 08:26:54.112] I0315 08:26:51.549232   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:26:54.112] I0315 08:26:51.549421   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:26:54.113] I0315 08:26:51.561821   58979 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552638411-23415", Name:"test1", UID:"155b7dcb-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"970", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test1-848d5d4b47 to 1
W0315 08:26:54.113] I0315 08:26:51.567962   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638411-23415", Name:"test1-848d5d4b47", UID:"155c3c81-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"971", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test1-848d5d4b47-pzvr2
W0315 08:26:54.113] I0315 08:26:52.549674   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
... skipping 53 lines ...
I0315 08:26:54.473] deployment.extensions "nginx" deleted
I0315 08:26:54.573] generic-resources.sh:280: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 08:26:54.742] generic-resources.sh:284: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 08:26:54.744] Successful
I0315 08:26:54.745] message:kubectl convert is DEPRECATED and will be removed in a future version.
I0315 08:26:54.745] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
I0315 08:26:54.745] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 08:26:54.746] has:Object 'Kind' is missing
I0315 08:26:54.837] generic-resources.sh:289: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 08:26:54.926] Successful
I0315 08:26:54.926] message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 08:26:54.926] has:busybox0:busybox1:
I0315 08:26:54.928] Successful
I0315 08:26:54.928] message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 08:26:54.928] has:Object 'Kind' is missing
I0315 08:26:55.019] generic-resources.sh:298: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 08:26:55.116] pod/busybox0 labeled pod/busybox1 labeled error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 08:26:55.208] generic-resources.sh:303: Successful get pods {{range.items}}{{.metadata.labels.mylabel}}:{{end}}: myvalue:myvalue:
I0315 08:26:55.210] Successful
I0315 08:26:55.211] message:pod/busybox0 labeled
I0315 08:26:55.211] pod/busybox1 labeled
I0315 08:26:55.211] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 08:26:55.211] has:Object 'Kind' is missing
I0315 08:26:55.302] generic-resources.sh:308: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 08:26:55.388] pod/busybox0 patched pod/busybox1 patched error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 08:26:55.474] generic-resources.sh:313: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: prom/busybox:prom/busybox:
I0315 08:26:55.476] Successful
I0315 08:26:55.476] message:pod/busybox0 patched
I0315 08:26:55.477] pod/busybox1 patched
I0315 08:26:55.477] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 08:26:55.477] has:Object 'Kind' is missing
I0315 08:26:55.565] generic-resources.sh:318: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 08:26:55.751] generic-resources.sh:322: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 08:26:55.753] Successful
I0315 08:26:55.754] message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
I0315 08:26:55.754] pod "busybox0" force deleted
I0315 08:26:55.754] pod "busybox1" force deleted
I0315 08:26:55.754] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 08:26:55.754] has:Object 'Kind' is missing
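The force-deletion warning above comes from combining an immediate grace period with --force; the objects are removed without waiting for termination. A sketch (the recursive file form is an assumption based on the fixture paths in the errors):

  kubectl delete -f hack/testdata/recursive/pod --recursive --grace-period=0 --force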
I0315 08:26:55.845] generic-resources.sh:327: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 08:26:56.004] replicationcontroller/busybox0 created
I0315 08:26:56.010] replicationcontroller/busybox1 created
W0315 08:26:56.110] kubectl convert is DEPRECATED and will be removed in a future version.
W0315 08:26:56.111] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
W0315 08:26:56.111] I0315 08:26:54.550765   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:26:56.111] I0315 08:26:54.550981   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:26:56.112] I0315 08:26:55.551251   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:26:56.112] I0315 08:26:55.551449   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:26:56.112] I0315 08:26:55.662455   58979 namespace_controller.go:171] Namespace has been deleted non-native-resources
W0315 08:26:56.112] error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W0315 08:26:56.113] I0315 08:26:56.010318   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552638411-8187", Name:"busybox0", UID:"180240e4-46fc-11e9-af77-0242ac110002", APIVersion:"v1", ResourceVersion:"1027", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-xzn4f
W0315 08:26:56.113] I0315 08:26:56.014838   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552638411-8187", Name:"busybox1", UID:"180323b4-46fc-11e9-af77-0242ac110002", APIVersion:"v1", ResourceVersion:"1029", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-bkjhv
I0315 08:26:56.213] generic-resources.sh:331: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 08:26:56.214] generic-resources.sh:336: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 08:26:56.298] generic-resources.sh:337: Successful get rc busybox0 {{.spec.replicas}}: 1
I0315 08:26:56.384] generic-resources.sh:338: Successful get rc busybox1 {{.spec.replicas}}: 1
I0315 08:26:56.556] generic-resources.sh:343: Successful get hpa busybox0 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
I0315 08:26:56.641] generic-resources.sh:344: Successful get hpa busybox1 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
I0315 08:26:56.643] Successful
I0315 08:26:56.644] message:horizontalpodautoscaler.autoscaling/busybox0 autoscaled
I0315 08:26:56.644] horizontalpodautoscaler.autoscaling/busybox1 autoscaled
I0315 08:26:56.644] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 08:26:56.644] has:Object 'Kind' is missing
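The HPA assertions above (min 1, max 2, target 80%) map one-to-one onto kubectl autoscale's flags. A sketch consistent with this output (the recursive file form is an assumption):

  kubectl autoscale -f hack/testdata/recursive/rc --recursive --min=1 --max=2 --cpu-percent=80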
I0315 08:26:56.722] horizontalpodautoscaler.autoscaling "busybox0" deleted
I0315 08:26:56.807] horizontalpodautoscaler.autoscaling "busybox1" deleted
I0315 08:26:56.906] generic-resources.sh:352: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 08:26:56.995] generic-resources.sh:353: Successful get rc busybox0 {{.spec.replicas}}: 1
I0315 08:26:57.087] generic-resources.sh:354: Successful get rc busybox1 {{.spec.replicas}}: 1
I0315 08:26:57.284] generic-resources.sh:358: Successful get service busybox0 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
I0315 08:26:57.373] generic-resources.sh:359: Successful get service busybox1 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
I0315 08:26:57.375] Successful
I0315 08:26:57.375] message:service/busybox0 exposed
I0315 08:26:57.375] service/busybox1 exposed
I0315 08:26:57.376] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 08:26:57.376] has:Object 'Kind' is missing
I0315 08:26:57.466] generic-resources.sh:365: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 08:26:57.554] generic-resources.sh:366: Successful get rc busybox0 {{.spec.replicas}}: 1
I0315 08:26:57.651] generic-resources.sh:367: Successful get rc busybox1 {{.spec.replicas}}: 1
I0315 08:26:57.882] generic-resources.sh:371: Successful get rc busybox0 {{.spec.replicas}}: 2
I0315 08:26:57.970] generic-resources.sh:372: Successful get rc busybox1 {{.spec.replicas}}: 2
I0315 08:26:57.972] Successful
I0315 08:26:57.972] message:replicationcontroller/busybox0 scaled
I0315 08:26:57.972] replicationcontroller/busybox1 scaled
I0315 08:26:57.972] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 08:26:57.973] has:Object 'Kind' is missing
I0315 08:26:58.059] generic-resources.sh:377: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 08:26:58.233] generic-resources.sh:381: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 08:26:58.235] Successful
I0315 08:26:58.236] message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
I0315 08:26:58.236] replicationcontroller "busybox0" force deleted
I0315 08:26:58.236] replicationcontroller "busybox1" force deleted
I0315 08:26:58.237] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 08:26:58.237] has:Object 'Kind' is missing
I0315 08:26:58.326] generic-resources.sh:386: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 08:26:58.482] deployment.apps/nginx1-deployment created
I0315 08:26:58.487] deployment.apps/nginx0-deployment created
W0315 08:26:58.587] I0315 08:26:56.551724   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:26:58.588] I0315 08:26:56.551956   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:26:58.588] I0315 08:26:57.552246   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:26:58.588] I0315 08:26:57.552490   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:26:58.589] I0315 08:26:57.766664   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552638411-8187", Name:"busybox0", UID:"180240e4-46fc-11e9-af77-0242ac110002", APIVersion:"v1", ResourceVersion:"1048", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-qb74x
W0315 08:26:58.589] I0315 08:26:57.779959   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552638411-8187", Name:"busybox1", UID:"180323b4-46fc-11e9-af77-0242ac110002", APIVersion:"v1", ResourceVersion:"1051", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-zpcpg
W0315 08:26:58.589] error: error validating "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W0315 08:26:58.589] I0315 08:26:58.489103   58979 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552638411-8187", Name:"nginx1-deployment", UID:"197c39c5-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1069", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx1-deployment-7c76c6cbb8 to 2
W0315 08:26:58.590] I0315 08:26:58.493555   58979 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552638411-8187", Name:"nginx0-deployment", UID:"197d18c0-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1070", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx0-deployment-7bb85585d7 to 2
W0315 08:26:58.590] I0315 08:26:58.494184   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638411-8187", Name:"nginx1-deployment-7c76c6cbb8", UID:"197d34ae-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1071", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-7c76c6cbb8-6qnjp
W0315 08:26:58.590] I0315 08:26:58.498223   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638411-8187", Name:"nginx0-deployment-7bb85585d7", UID:"197dccfb-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1072", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-7bb85585d7-b9b9r
W0315 08:26:58.591] I0315 08:26:58.499989   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638411-8187", Name:"nginx1-deployment-7c76c6cbb8", UID:"197d34ae-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1071", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-7c76c6cbb8-t6m6l
W0315 08:26:58.591] I0315 08:26:58.504932   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638411-8187", Name:"nginx0-deployment-7bb85585d7", UID:"197dccfb-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1072", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-7bb85585d7-5sqmq
... skipping 2 lines ...
I0315 08:26:58.692] generic-resources.sh:390: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx0-deployment:nginx1-deployment:
I0315 08:26:58.702] generic-resources.sh:391: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
I0315 08:26:58.917] generic-resources.sh:395: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
I0315 08:26:58.919] Successful
I0315 08:26:58.919] message:deployment.apps/nginx1-deployment skipped rollback (current template already matches revision 1)
I0315 08:26:58.919] deployment.apps/nginx0-deployment skipped rollback (current template already matches revision 1)
I0315 08:26:58.920] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0315 08:26:58.920] has:Object 'Kind' is missing
I0315 08:26:59.021] deployment.apps/nginx1-deployment paused
I0315 08:26:59.032] deployment.apps/nginx0-deployment paused
I0315 08:26:59.147] generic-resources.sh:402: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: true:true:
I0315 08:26:59.149] Successful
I0315 08:26:59.150] message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
... skipping 10 lines ...
I0315 08:26:59.500] 1         <none>
I0315 08:26:59.501] 
I0315 08:26:59.501] deployment.apps/nginx0-deployment 
I0315 08:26:59.501] REVISION  CHANGE-CAUSE
I0315 08:26:59.501] 1         <none>
I0315 08:26:59.501] 
I0315 08:26:59.501] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0315 08:26:59.502] has:nginx0-deployment
I0315 08:26:59.502] Successful
I0315 08:26:59.502] message:deployment.apps/nginx1-deployment 
I0315 08:26:59.503] REVISION  CHANGE-CAUSE
I0315 08:26:59.503] 1         <none>
I0315 08:26:59.503] 
I0315 08:26:59.503] deployment.apps/nginx0-deployment 
I0315 08:26:59.503] REVISION  CHANGE-CAUSE
I0315 08:26:59.503] 1         <none>
I0315 08:26:59.503] 
I0315 08:26:59.504] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0315 08:26:59.504] has:nginx1-deployment
I0315 08:26:59.504] Successful
I0315 08:26:59.505] message:deployment.apps/nginx1-deployment 
I0315 08:26:59.505] REVISION  CHANGE-CAUSE
I0315 08:26:59.505] 1         <none>
I0315 08:26:59.505] 
I0315 08:26:59.505] deployment.apps/nginx0-deployment 
I0315 08:26:59.505] REVISION  CHANGE-CAUSE
I0315 08:26:59.506] 1         <none>
I0315 08:26:59.506] 
I0315 08:26:59.506] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0315 08:26:59.506] has:Object 'Kind' is missing
I0315 08:26:59.586] deployment.apps "nginx1-deployment" force deleted
I0315 08:26:59.592] deployment.apps "nginx0-deployment" force deleted
W0315 08:26:59.693] I0315 08:26:59.553548   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:26:59.694] I0315 08:26:59.553890   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:26:59.694] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0315 08:26:59.694] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
W0315 08:27:00.555] I0315 08:27:00.554234   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:27:00.555] I0315 08:27:00.554443   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 08:27:00.689] generic-resources.sh:424: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 08:27:00.846] replicationcontroller/busybox0 created
I0315 08:27:00.851] replicationcontroller/busybox1 created
W0315 08:27:00.952] error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W0315 08:27:00.952] I0315 08:27:00.852001   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552638411-8187", Name:"busybox0", UID:"1ae5006f-46fc-11e9-af77-0242ac110002", APIVersion:"v1", ResourceVersion:"1118", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-jh867
W0315 08:27:00.953] I0315 08:27:00.855828   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552638411-8187", Name:"busybox1", UID:"1ae5d1f3-46fc-11e9-af77-0242ac110002", APIVersion:"v1", ResourceVersion:"1119", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-m7t7l
I0315 08:27:01.053] generic-resources.sh:428: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 08:27:01.053] Successful
I0315 08:27:01.054] message:no rollbacker has been implemented for "ReplicationController"
I0315 08:27:01.054] no rollbacker has been implemented for "ReplicationController"
... skipping 3 lines ...
I0315 08:27:01.055] message:no rollbacker has been implemented for "ReplicationController"
I0315 08:27:01.055] no rollbacker has been implemented for "ReplicationController"
I0315 08:27:01.055] unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 08:27:01.055] has:Object 'Kind' is missing
I0315 08:27:01.140] Successful
I0315 08:27:01.140] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 08:27:01.141] error: replicationcontrollers "busybox0" pausing is not supported
I0315 08:27:01.141] error: replicationcontrollers "busybox1" pausing is not supported
I0315 08:27:01.141] has:Object 'Kind' is missing
I0315 08:27:01.142] Successful
I0315 08:27:01.143] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 08:27:01.143] error: replicationcontrollers "busybox0" pausing is not supported
I0315 08:27:01.143] error: replicationcontrollers "busybox1" pausing is not supported
I0315 08:27:01.143] has:replicationcontrollers "busybox0" pausing is not supported
I0315 08:27:01.144] Successful
I0315 08:27:01.144] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 08:27:01.145] error: replicationcontrollers "busybox0" pausing is not supported
I0315 08:27:01.145] error: replicationcontrollers "busybox1" pausing is not supported
I0315 08:27:01.145] has:replicationcontrollers "busybox1" pausing is not supported
I0315 08:27:01.235] Successful
I0315 08:27:01.235] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 08:27:01.236] error: replicationcontrollers "busybox0" resuming is not supported
I0315 08:27:01.236] error: replicationcontrollers "busybox1" resuming is not supported
I0315 08:27:01.236] has:Object 'Kind' is missing
I0315 08:27:01.237] Successful
I0315 08:27:01.237] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 08:27:01.237] error: replicationcontrollers "busybox0" resuming is not supported
I0315 08:27:01.238] error: replicationcontrollers "busybox1" resuming is not supported
I0315 08:27:01.238] has:replicationcontrollers "busybox0" resuming is not supported
I0315 08:27:01.239] Successful
I0315 08:27:01.240] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 08:27:01.240] error: replicationcontrollers "busybox0" resuming is not supported
I0315 08:27:01.240] error: replicationcontrollers "busybox1" resuming is not supported
I0315 08:27:01.240] has:replicationcontrollers "busybox0" resuming is not supported
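rollout pause and resume are wired up per workload kind, and ReplicationController implements neither, so each object fails individually while the broken file still contributes its decode error. A sketch (the recursive file form is an assumption):

  kubectl rollout pause -f hack/testdata/recursive/rc --recursive
  kubectl rollout resume -f hack/testdata/recursive/rc --recursive
  # error: replicationcontrollers "busybox0" pausing is not supported ...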
I0315 08:27:01.319] replicationcontroller "busybox0" force deleted
I0315 08:27:01.324] replicationcontroller "busybox1" force deleted
W0315 08:27:01.425] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0315 08:27:01.425] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
W0315 08:27:01.555] I0315 08:27:01.554820   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:27:01.556] I0315 08:27:01.555146   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 08:27:02.331] Recording: run_namespace_tests
I0315 08:27:02.332] Running command: run_namespace_tests
I0315 08:27:02.351] 
I0315 08:27:02.353] +++ Running case: test-cmd.run_namespace_tests 
... skipping 14 lines ...
W0315 08:27:06.558] I0315 08:27:06.557894   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:27:06.559] I0315 08:27:06.558094   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:27:07.559] I0315 08:27:07.558345   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:27:07.559] I0315 08:27:07.558563   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 08:27:07.732] namespace/my-namespace condition met
I0315 08:27:07.824] Successful
I0315 08:27:07.824] message:Error from server (NotFound): namespaces "my-namespace" not found
I0315 08:27:07.824] has: not found
I0315 08:27:07.928] core.sh:1336: Successful get namespaces {{range.items}}{{ if eq $id_field \"other\" }}found{{end}}{{end}}:: :
I0315 08:27:08.001] namespace/other created
I0315 08:27:08.093] core.sh:1340: Successful get namespaces/other {{.metadata.name}}: other
I0315 08:27:08.183] core.sh:1344: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 08:27:08.346] pod/valid-pod created
I0315 08:27:08.446] core.sh:1348: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0315 08:27:08.542] core.sh:1350: Successful get pods -n other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0315 08:27:08.631] Successful
I0315 08:27:08.632] message:error: a resource cannot be retrieved by name across all namespaces
I0315 08:27:08.632] has:a resource cannot be retrieved by name across all namespaces
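A resource name is only unique within one namespace, so kubectl rejects naming a resource while searching all of them. A sketch reproducing the assertion:

  kubectl get pods valid-pod --all-namespaces
  # error: a resource cannot be retrieved by name across all namespaces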
I0315 08:27:08.730] core.sh:1357: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0315 08:27:08.816] pod "valid-pod" force deleted
I0315 08:27:08.915] core.sh:1361: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 08:27:08.992] namespace "other" deleted
W0315 08:27:09.092] I0315 08:27:08.558840   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:27:09.093] I0315 08:27:08.559103   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:27:09.093] E0315 08:27:08.786320   58979 resource_quota_controller.go:437] failed to sync resource monitors: couldn't start monitor for resource "extensions/v1beta1, Resource=networkpolicies": unable to monitor quota for resource "extensions/v1beta1, Resource=networkpolicies"
W0315 08:27:09.093] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0315 08:27:09.500] I0315 08:27:09.499590   58979 controller_utils.go:1027] Waiting for caches to sync for garbage collector controller
W0315 08:27:09.560] I0315 08:27:09.559444   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:27:09.560] I0315 08:27:09.559794   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:27:09.600] I0315 08:27:09.599972   58979 controller_utils.go:1034] Caches are synced for garbage collector controller
W0315 08:27:10.560] I0315 08:27:10.560104   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
... skipping 152 lines ...
I0315 08:27:29.507] +++ command: run_client_config_tests
I0315 08:27:29.520] +++ [0315 08:27:29] Creating namespace namespace-1552638449-4446
I0315 08:27:29.602] namespace/namespace-1552638449-4446 created
I0315 08:27:29.687] Context "test" modified.
I0315 08:27:29.695] +++ [0315 08:27:29] Testing client config
I0315 08:27:29.787] Successful
I0315 08:27:29.787] message:error: stat missing: no such file or directory
I0315 08:27:29.787] has:missing: no such file or directory
I0315 08:27:29.868] Successful
I0315 08:27:29.868] message:error: stat missing: no such file or directory
I0315 08:27:29.868] has:missing: no such file or directory
I0315 08:27:29.948] Successful
I0315 08:27:29.948] message:error: stat missing: no such file or directory
I0315 08:27:29.949] has:missing: no such file or directory
I0315 08:27:30.026] Successful
I0315 08:27:30.027] message:Error in configuration: context was not found for specified context: missing-context
I0315 08:27:30.027] has:context was not found for specified context: missing-context
I0315 08:27:30.102] Successful
I0315 08:27:30.102] message:error: no server found for cluster "missing-cluster"
I0315 08:27:30.103] has:no server found for cluster "missing-cluster"
I0315 08:27:30.181] Successful
I0315 08:27:30.182] message:error: auth info "missing-user" does not exist
I0315 08:27:30.182] has:auth info "missing-user" does not exist
W0315 08:27:30.283] I0315 08:27:29.571238   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:27:30.283] I0315 08:27:29.571516   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 08:27:30.384] Successful
I0315 08:27:30.384] message:error: Error loading config file "/tmp/newconfig.yaml": no kind "Config" is registered for version "v-1" in scheme "k8s.io/client-go/tools/clientcmd/api/latest/latest.go:50"
I0315 08:27:30.385] has:Error loading config file
I0315 08:27:30.414] Successful
I0315 08:27:30.414] message:error: stat missing-config: no such file or directory
I0315 08:27:30.415] has:no such file or directory
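Each failure above corresponds to one standard kubectl client-config flag pointed at something that doesn't exist; pairing each flag with its message below is an inference from the error texts:

  kubectl get pods --kubeconfig=missing         # error: stat missing: no such file or directory
  kubectl get pods --context=missing-context    # context was not found for specified context
  kubectl get pods --cluster=missing-cluster    # error: no server found for cluster "missing-cluster"
  kubectl get pods --user=missing-user          # error: auth info "missing-user" does not exist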
I0315 08:27:30.430] +++ exit code: 0
I0315 08:27:30.467] Recording: run_service_accounts_tests
I0315 08:27:30.467] Running command: run_service_accounts_tests
I0315 08:27:30.490] 
I0315 08:27:30.492] +++ Running case: test-cmd.run_service_accounts_tests 
... skipping 48 lines ...
I0315 08:27:37.496] Labels:                        run=pi
I0315 08:27:37.496] Annotations:                   <none>
I0315 08:27:37.496] Schedule:                      59 23 31 2 *
I0315 08:27:37.496] Concurrency Policy:            Allow
I0315 08:27:37.496] Suspend:                       False
I0315 08:27:37.496] Successful Job History Limit:  824643002680
I0315 08:27:37.496] Failed Job History Limit:      1
I0315 08:27:37.497] Starting Deadline Seconds:     <unset>
I0315 08:27:37.497] Selector:                      <unset>
I0315 08:27:37.497] Parallelism:                   <unset>
I0315 08:27:37.497] Completions:                   <unset>
I0315 08:27:37.497] Pod Template:
I0315 08:27:37.497]   Labels:  run=pi
... skipping 31 lines ...
I0315 08:27:38.099]                 job-name=test-job
I0315 08:27:38.099]                 run=pi
I0315 08:27:38.100] Annotations:    cronjob.kubernetes.io/instantiate: manual
I0315 08:27:38.100] Parallelism:    1
I0315 08:27:38.100] Completions:    1
I0315 08:27:38.100] Start Time:     Fri, 15 Mar 2019 08:27:37 +0000
I0315 08:27:38.100] Pods Statuses:  1 Running / 0 Succeeded / 0 Failed
I0315 08:27:38.101] Pod Template:
I0315 08:27:38.101]   Labels:  controller-uid=30ec15b1-46fc-11e9-af77-0242ac110002
I0315 08:27:38.101]            job-name=test-job
I0315 08:27:38.101]            run=pi
I0315 08:27:38.101]   Containers:
I0315 08:27:38.101]    pi:
... skipping 411 lines ...
I0315 08:27:47.847]   sessionAffinity: None
I0315 08:27:47.847]   type: ClusterIP
I0315 08:27:47.847] status:
I0315 08:27:47.847]   loadBalancer: {}
W0315 08:27:47.948] I0315 08:27:47.582985   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:27:47.948] I0315 08:27:47.583182   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:27:47.948] error: you must specify resources by --filename when --local is set.
W0315 08:27:47.949] Example resource specifications include:
W0315 08:27:47.949]    '-f rsrc.yaml'
W0315 08:27:47.949]    '--filename=rsrc.json'
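--local asks kubectl to compute the change entirely client-side, which only works when the object comes from a file rather than the server, hence the demand for --filename. A hedged sketch (the set subcommand and file name are assumptions; the selector assertion just below suggests kubectl set selector):

  kubectl set selector -f redis-master-service.yaml role=padawan --local -o yaml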
I0315 08:27:48.049] core.sh:886: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
I0315 08:27:48.183] core.sh:893: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
I0315 08:27:48.269] service "redis-master" deleted
... skipping 107 lines ...
I0315 08:27:55.061] apps.sh:80: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0315 08:27:55.148] apps.sh:81: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
I0315 08:27:55.251] daemonset.extensions/bind rolled back
I0315 08:27:55.354] apps.sh:84: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I0315 08:27:55.444] apps.sh:85: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0315 08:27:55.547] Successful
I0315 08:27:55.548] message:error: unable to find specified revision 1000000 in history
I0315 08:27:55.548] has:unable to find specified revision
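apps.sh is asking the rollout machinery for a revision number that was never recorded, then confirming the daemonset is untouched. Presumably the failing call looked like this sketch (exact flags are an assumption):

    kubectl rollout history daemonset/bind                      # shows which revisions actually exist
    kubectl rollout undo daemonset/bind --to-revision=1000000   # error: unable to find specified revision 1000000 in history
    kubectl rollout undo daemonset/bind                         # without --to-revision, rolls back to the previous revision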
I0315 08:27:55.635] apps.sh:89: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I0315 08:27:55.728] apps.sh:90: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0315 08:27:55.833] daemonset.extensions/bind rolled back
I0315 08:27:55.933] apps.sh:93: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
I0315 08:27:56.027] apps.sh:94: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
... skipping 19 lines ...
W0315 08:27:57.258] I0315 08:27:50.584835   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:27:57.258] I0315 08:27:50.585017   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:27:57.258] kubectl run --generator=deployment/apps.v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
W0315 08:27:57.259] I0315 08:27:51.217793   58979 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"default", Name:"testmetadata", UID:"38ea0b55-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1306", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set testmetadata-74fbc8d655 to 2
W0315 08:27:57.259] I0315 08:27:51.225275   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"default", Name:"testmetadata-74fbc8d655", UID:"38eaf508-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1307", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: testmetadata-74fbc8d655-lqxvm
W0315 08:27:57.259] I0315 08:27:51.229673   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"default", Name:"testmetadata-74fbc8d655", UID:"38eaf508-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1307", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: testmetadata-74fbc8d655-d46xl
W0315 08:27:57.260] E0315 08:27:51.233821   58979 event.go:247] Could not construct reference to: '&v1.Endpoints{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"testmetadata", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string{}, OwnerReferences:[]v1.OwnerReference(nil), Initializers:(*v1.Initializers)(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Subsets:[]v1.EndpointSubset{}}' due to: 'selfLink was empty, can't make reference'. Will not report event: 'Warning' 'FailedToCreateEndpoint' 'Failed to create endpoint for service default/testmetadata: endpoints "testmetadata" already exists'
W0315 08:27:57.260] I0315 08:27:51.585192   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:27:57.260] I0315 08:27:51.585362   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:27:57.261] I0315 08:27:52.585698   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:27:57.261] I0315 08:27:52.585956   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:27:57.261] I0315 08:27:52.760818   55692 controller.go:606] quota admission added evaluator for: daemonsets.extensions
W0315 08:27:57.261] I0315 08:27:53.586297   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:27:57.261] I0315 08:27:53.586599   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:27:57.261] I0315 08:27:54.586822   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:27:57.262] I0315 08:27:54.587027   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:27:57.265] E0315 08:27:55.283627   58979 daemon_controller.go:302] namespace-1552638473-3892/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1552638473-3892", SelfLink:"/apis/apps/v1/namespaces/namespace-1552638473-3892/daemonsets/bind", UID:"3a784b8b-46fc-11e9-af77-0242ac110002", ResourceVersion:"1368", Generation:3, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63688235273, loc:(*time.Location)(0x6abbc40)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true", "deprecated.daemonset.template.generation":"3", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true\"},\"labels\":{\"service\":\"bind\"},\"name\":\"bind\",\"namespace\":\"namespace-1552638473-3892\"},\"spec\":{\"selector\":{\"matchLabels\":{\"service\":\"bind\"}},\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n"}, OwnerReferences:[]v1.OwnerReference(nil), Initializers:(*v1.Initializers)(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001e8cda0), Fields:(*v1.Fields)(0xc002fc8888)}, v1.ManagedFieldsEntry{Manager:"kube-controller-manager", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001e8d460), Fields:(*v1.Fields)(0xc002fc8920)}, v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001e8d500), Fields:(*v1.Fields)(0xc002fc8948)}, v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc001e8d660), Fields:(*v1.Fields)(0xc002fc8978)}}}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc001e8d840), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Initializers:(*v1.Initializers)(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:2.0", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc002a75f78), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc00245f8c0), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc001e8d880), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc002fc89e0)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc002a75ff0)}, Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:2, UpdatedNumberScheduled:0, NumberAvailable:0, NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
W0315 08:27:57.265] I0315 08:27:55.587434   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:27:57.265] I0315 08:27:55.587610   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:27:57.265] I0315 08:27:56.587924   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:27:57.265] I0315 08:27:56.588153   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:27:57.266] I0315 08:27:56.704426   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552638476-5734", Name:"frontend", UID:"3c2f2337-46fc-11e9-af77-0242ac110002", APIVersion:"v1", ResourceVersion:"1380", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-srqdg
W0315 08:27:57.266] I0315 08:27:56.709036   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552638476-5734", Name:"frontend", UID:"3c2f2337-46fc-11e9-af77-0242ac110002", APIVersion:"v1", ResourceVersion:"1380", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-c9sg9
... skipping 7 lines ...
I0315 08:27:57.439] Namespace:    namespace-1552638476-5734
I0315 08:27:57.439] Selector:     app=guestbook,tier=frontend
I0315 08:27:57.439] Labels:       app=guestbook
I0315 08:27:57.439]               tier=frontend
I0315 08:27:57.439] Annotations:  <none>
I0315 08:27:57.439] Replicas:     3 current / 3 desired
I0315 08:27:57.440] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 08:27:57.440] Pod Template:
I0315 08:27:57.440]   Labels:  app=guestbook
I0315 08:27:57.440]            tier=frontend
I0315 08:27:57.440]   Containers:
I0315 08:27:57.440]    php-redis:
I0315 08:27:57.440]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
I0315 08:27:57.562] Namespace:    namespace-1552638476-5734
I0315 08:27:57.562] Selector:     app=guestbook,tier=frontend
I0315 08:27:57.563] Labels:       app=guestbook
I0315 08:27:57.563]               tier=frontend
I0315 08:27:57.563] Annotations:  <none>
I0315 08:27:57.563] Replicas:     3 current / 3 desired
I0315 08:27:57.563] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 08:27:57.563] Pod Template:
I0315 08:27:57.563]   Labels:  app=guestbook
I0315 08:27:57.563]            tier=frontend
I0315 08:27:57.563]   Containers:
I0315 08:27:57.564]    php-redis:
I0315 08:27:57.564]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 20 lines ...
I0315 08:27:57.767] Namespace:    namespace-1552638476-5734
I0315 08:27:57.767] Selector:     app=guestbook,tier=frontend
I0315 08:27:57.767] Labels:       app=guestbook
I0315 08:27:57.768]               tier=frontend
I0315 08:27:57.768] Annotations:  <none>
I0315 08:27:57.768] Replicas:     3 current / 3 desired
I0315 08:27:57.768] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 08:27:57.768] Pod Template:
I0315 08:27:57.768]   Labels:  app=guestbook
I0315 08:27:57.768]            tier=frontend
I0315 08:27:57.768]   Containers:
I0315 08:27:57.768]    php-redis:
I0315 08:27:57.769]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 12 lines ...
I0315 08:27:57.804] Namespace:    namespace-1552638476-5734
I0315 08:27:57.804] Selector:     app=guestbook,tier=frontend
I0315 08:27:57.804] Labels:       app=guestbook
I0315 08:27:57.804]               tier=frontend
I0315 08:27:57.804] Annotations:  <none>
I0315 08:27:57.804] Replicas:     3 current / 3 desired
I0315 08:27:57.804] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 08:27:57.804] Pod Template:
I0315 08:27:57.805]   Labels:  app=guestbook
I0315 08:27:57.805]            tier=frontend
I0315 08:27:57.805]   Containers:
I0315 08:27:57.805]    php-redis:
I0315 08:27:57.805]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
I0315 08:27:57.957] Namespace:    namespace-1552638476-5734
I0315 08:27:57.957] Selector:     app=guestbook,tier=frontend
I0315 08:27:57.957] Labels:       app=guestbook
I0315 08:27:57.957]               tier=frontend
I0315 08:27:57.957] Annotations:  <none>
I0315 08:27:57.957] Replicas:     3 current / 3 desired
I0315 08:27:57.958] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 08:27:57.958] Pod Template:
I0315 08:27:57.958]   Labels:  app=guestbook
I0315 08:27:57.958]            tier=frontend
I0315 08:27:57.958]   Containers:
I0315 08:27:57.958]    php-redis:
I0315 08:27:57.958]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
I0315 08:27:58.067] Namespace:    namespace-1552638476-5734
I0315 08:27:58.067] Selector:     app=guestbook,tier=frontend
I0315 08:27:58.067] Labels:       app=guestbook
I0315 08:27:58.067]               tier=frontend
I0315 08:27:58.067] Annotations:  <none>
I0315 08:27:58.067] Replicas:     3 current / 3 desired
I0315 08:27:58.068] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 08:27:58.068] Pod Template:
I0315 08:27:58.068]   Labels:  app=guestbook
I0315 08:27:58.068]            tier=frontend
I0315 08:27:58.068]   Containers:
I0315 08:27:58.068]    php-redis:
I0315 08:27:58.068]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
I0315 08:27:58.171] Namespace:    namespace-1552638476-5734
I0315 08:27:58.171] Selector:     app=guestbook,tier=frontend
I0315 08:27:58.171] Labels:       app=guestbook
I0315 08:27:58.171]               tier=frontend
I0315 08:27:58.172] Annotations:  <none>
I0315 08:27:58.172] Replicas:     3 current / 3 desired
I0315 08:27:58.172] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 08:27:58.172] Pod Template:
I0315 08:27:58.172]   Labels:  app=guestbook
I0315 08:27:58.172]            tier=frontend
I0315 08:27:58.172]   Containers:
I0315 08:27:58.172]    php-redis:
I0315 08:27:58.172]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 11 lines ...
I0315 08:27:58.288] Namespace:    namespace-1552638476-5734
I0315 08:27:58.288] Selector:     app=guestbook,tier=frontend
I0315 08:27:58.288] Labels:       app=guestbook
I0315 08:27:58.288]               tier=frontend
I0315 08:27:58.288] Annotations:  <none>
I0315 08:27:58.288] Replicas:     3 current / 3 desired
I0315 08:27:58.288] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 08:27:58.288] Pod Template:
I0315 08:27:58.289]   Labels:  app=guestbook
I0315 08:27:58.289]            tier=frontend
I0315 08:27:58.289]   Containers:
I0315 08:27:58.289]    php-redis:
I0315 08:27:58.289]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 24 lines ...
I0315 08:27:59.291] replicationcontroller/frontend scaled
I0315 08:27:59.383] core.sh:1095: Successful get rc frontend {{.spec.replicas}}: 2
I0315 08:27:59.459] replicationcontroller "frontend" deleted
W0315 08:27:59.560] I0315 08:27:58.477022   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552638476-5734", Name:"frontend", UID:"3c754e93-46fc-11e9-af77-0242ac110002", APIVersion:"v1", ResourceVersion:"1406", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-m6jgw
W0315 08:27:59.560] I0315 08:27:58.589118   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:27:59.561] I0315 08:27:58.589350   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:27:59.561] error: Expected replicas to be 3, was 2
W0315 08:27:59.561] I0315 08:27:59.025415   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552638476-5734", Name:"frontend", UID:"3c754e93-46fc-11e9-af77-0242ac110002", APIVersion:"v1", ResourceVersion:"1412", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-8lpsc
W0315 08:27:59.561] I0315 08:27:59.299260   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552638476-5734", Name:"frontend", UID:"3c754e93-46fc-11e9-af77-0242ac110002", APIVersion:"v1", ResourceVersion:"1417", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-8lpsc
W0315 08:27:59.590] I0315 08:27:59.589773   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:27:59.591] I0315 08:27:59.589990   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:27:59.632] I0315 08:27:59.631252   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552638476-5734", Name:"redis-master", UID:"3dedfcc6-46fc-11e9-af77-0242ac110002", APIVersion:"v1", ResourceVersion:"1428", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-spx6f
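The "Expected replicas to be 3, was 2" error above is kubectl scale's optional precondition at work: --current-replicas guards the scale against racing writers by requiring an exact current count. A sketch (resource name and flag values are illustrative):

    kubectl scale rc frontend --current-replicas=3 --replicas=2   # fails unless the rc currently has exactly 3 replicas
    kubectl scale rc frontend --replicas=2                        # unconditional scale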
I0315 08:27:59.732] replicationcontroller/redis-master created
... skipping 40 lines ...
I0315 08:28:01.469] service "expose-test-deployment" deleted
I0315 08:28:01.575] Successful
I0315 08:28:01.575] message:service/expose-test-deployment exposed
I0315 08:28:01.576] has:service/expose-test-deployment exposed
I0315 08:28:01.658] service "expose-test-deployment" deleted
I0315 08:28:01.753] Successful
I0315 08:28:01.753] message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
I0315 08:28:01.754] See 'kubectl expose -h' for help and examples
I0315 08:28:01.754] has:invalid deployment: no selectors
I0315 08:28:01.840] Successful
I0315 08:28:01.840] message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
I0315 08:28:01.841] See 'kubectl expose -h' for help and examples
I0315 08:28:01.841] has:invalid deployment: no selectors
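kubectl expose builds the Service selector by introspecting the exposed object, so a deployment that carries no .spec.selector gives it nothing to copy and the command fails, as checked twice above. A hedged sketch of both paths (resource names are illustrative):

    kubectl expose deployment no-selector-deployment --port=80                # fails: invalid deployment: no selectors
    kubectl expose deployment nginx-deployment --port=80 --target-port=8000   # normal case: selector copied from the deployment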
W0315 08:28:01.942] I0315 08:28:01.590761   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:28:01.942] I0315 08:28:01.590975   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:28:02.018] I0315 08:28:02.017886   58979 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552638476-5734", Name:"nginx-deployment", UID:"3f5a18c1-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1533", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-64bb598779 to 3
W0315 08:28:02.023] I0315 08:28:02.022778   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638476-5734", Name:"nginx-deployment-64bb598779", UID:"3f5afea8-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1534", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-64bb598779-9mrv5
... skipping 27 lines ...
I0315 08:28:03.939] service "frontend" deleted
I0315 08:28:03.946] service "frontend-2" deleted
I0315 08:28:03.953] service "frontend-3" deleted
I0315 08:28:03.960] service "frontend-4" deleted
I0315 08:28:03.967] service "frontend-5" deleted
I0315 08:28:04.064] Successful
I0315 08:28:04.064] message:error: cannot expose a Node
I0315 08:28:04.064] has:cannot expose
I0315 08:28:04.153] Successful
I0315 08:28:04.154] message:The Service "invalid-large-service-name-that-has-more-than-sixty-three-characters" is invalid: metadata.name: Invalid value: "invalid-large-service-name-that-has-more-than-sixty-three-characters": must be no more than 63 characters
I0315 08:28:04.154] has:metadata.name: Invalid value
I0315 08:28:04.246] Successful
I0315 08:28:04.247] message:service/kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
... skipping 38 lines ...
W0315 08:28:06.686] I0315 08:28:05.593078   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:28:06.687] I0315 08:28:05.983828   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552638476-5734", Name:"frontend", UID:"41b776bc-46fc-11e9-af77-0242ac110002", APIVersion:"v1", ResourceVersion:"1653", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-7ckf5
W0315 08:28:06.687] I0315 08:28:05.988407   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552638476-5734", Name:"frontend", UID:"41b776bc-46fc-11e9-af77-0242ac110002", APIVersion:"v1", ResourceVersion:"1653", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-mvh6f
W0315 08:28:06.687] I0315 08:28:05.988840   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552638476-5734", Name:"frontend", UID:"41b776bc-46fc-11e9-af77-0242ac110002", APIVersion:"v1", ResourceVersion:"1653", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-cvf4p
W0315 08:28:06.687] I0315 08:28:06.593414   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:28:06.688] I0315 08:28:06.593580   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:28:06.688] Error: required flag(s) "max" not set
W0315 08:28:06.688] 
W0315 08:28:06.688] 
W0315 08:28:06.688] Examples:
W0315 08:28:06.688]   # Auto scale a deployment "foo", with the number of pods between 2 and 10, no target CPU utilization specified so a default autoscaling policy will be used:
W0315 08:28:06.688]   kubectl autoscale deployment foo --min=2 --max=10
W0315 08:28:06.688]   
... skipping 55 lines ...
I0315 08:28:06.911]           limits:
I0315 08:28:06.911]             cpu: 300m
I0315 08:28:06.911]           requests:
I0315 08:28:06.911]             cpu: 300m
I0315 08:28:06.911]       terminationGracePeriodSeconds: 0
I0315 08:28:06.911] status: {}
W0315 08:28:07.012] Error from server (NotFound): deployments.apps "nginx-deployment-resources" not found
I0315 08:28:07.166] deployment.apps/nginx-deployment-resources created
W0315 08:28:07.266] I0315 08:28:07.171470   58979 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552638476-5734", Name:"nginx-deployment-resources", UID:"426c7c12-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1674", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-695c766d58 to 3
W0315 08:28:07.267] I0315 08:28:07.176596   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638476-5734", Name:"nginx-deployment-resources-695c766d58", UID:"426d5c31-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1675", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-695c766d58-ffd84
W0315 08:28:07.267] I0315 08:28:07.182723   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638476-5734", Name:"nginx-deployment-resources-695c766d58", UID:"426d5c31-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1675", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-695c766d58-z785w
W0315 08:28:07.268] I0315 08:28:07.182804   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638476-5734", Name:"nginx-deployment-resources-695c766d58", UID:"426d5c31-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1675", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-695c766d58-rj74d
I0315 08:28:07.368] core.sh:1278: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment-resources:
... skipping 4 lines ...
I0315 08:28:07.771] core.sh:1284: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
I0315 08:28:07.969] deployment.extensions/nginx-deployment-resources resource requirements updated
W0315 08:28:08.070] I0315 08:28:07.573628   58979 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552638476-5734", Name:"nginx-deployment-resources", UID:"426c7c12-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1688", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-5b7fc6dd8b to 1
W0315 08:28:08.070] I0315 08:28:07.578888   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638476-5734", Name:"nginx-deployment-resources-5b7fc6dd8b", UID:"42aaaec1-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1689", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-5b7fc6dd8b-b28cs
W0315 08:28:08.070] I0315 08:28:07.593922   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:28:08.071] I0315 08:28:07.594066   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:28:08.071] error: unable to find container named redis
W0315 08:28:08.071] I0315 08:28:07.991371   58979 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552638476-5734", Name:"nginx-deployment-resources", UID:"426c7c12-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1697", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-5b7fc6dd8b to 0
W0315 08:28:08.072] I0315 08:28:07.997621   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638476-5734", Name:"nginx-deployment-resources-5b7fc6dd8b", UID:"42aaaec1-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1701", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-5b7fc6dd8b-b28cs
W0315 08:28:08.072] I0315 08:28:08.011940   58979 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552638476-5734", Name:"nginx-deployment-resources", UID:"426c7c12-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1700", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6bc4567bf6 to 1
W0315 08:28:08.072] I0315 08:28:08.017204   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638476-5734", Name:"nginx-deployment-resources-6bc4567bf6", UID:"42e7e959-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1707", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6bc4567bf6-j9r5l
I0315 08:28:08.173] core.sh:1289: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
I0315 08:28:08.174] core.sh:1290: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
... skipping 213 lines ...
I0315 08:28:08.674]   observedGeneration: 4
I0315 08:28:08.674]   replicas: 4
I0315 08:28:08.674]   unavailableReplicas: 4
I0315 08:28:08.674]   updatedReplicas: 1
W0315 08:28:08.774] I0315 08:28:08.594402   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:28:08.775] I0315 08:28:08.594624   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:28:08.775] error: you must specify resources by --filename when --local is set.
W0315 08:28:08.775] Example resource specifications include:
W0315 08:28:08.775]    '-f rsrc.yaml'
W0315 08:28:08.775]    '--filename=rsrc.json'
I0315 08:28:08.876] core.sh:1299: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
I0315 08:28:08.907] core.sh:1300: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
I0315 08:28:09.000] core.sh:1301: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
... skipping 46 lines ...
I0315 08:28:10.526]                 pod-template-hash=7875bf5c8b
I0315 08:28:10.526] Annotations:    deployment.kubernetes.io/desired-replicas: 1
I0315 08:28:10.526]                 deployment.kubernetes.io/max-replicas: 2
I0315 08:28:10.526]                 deployment.kubernetes.io/revision: 1
I0315 08:28:10.526] Controlled By:  Deployment/test-nginx-apps
I0315 08:28:10.526] Replicas:       1 current / 1 desired
I0315 08:28:10.526] Pods Status:    0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0315 08:28:10.527] Pod Template:
I0315 08:28:10.527]   Labels:  app=test-nginx-apps
I0315 08:28:10.527]            pod-template-hash=7875bf5c8b
I0315 08:28:10.527]   Containers:
I0315 08:28:10.527]    nginx:
I0315 08:28:10.527]     Image:        k8s.gcr.io/nginx:test-cmd
... skipping 103 lines ...
W0315 08:28:14.756] I0315 08:28:14.598068   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:28:15.599] I0315 08:28:15.598486   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:28:15.600] I0315 08:28:15.598817   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 08:28:15.825] apps.sh:300: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0315 08:28:16.120] apps.sh:303: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0315 08:28:16.290] deployment.extensions/nginx rolled back
W0315 08:28:16.390] error: unable to find specified revision 1000000 in history
W0315 08:28:16.600] I0315 08:28:16.599205   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:28:16.600] I0315 08:28:16.599489   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 08:28:17.434] apps.sh:307: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
I0315 08:28:17.580] deployment.extensions/nginx paused
W0315 08:28:17.681] I0315 08:28:17.599747   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:28:17.682] I0315 08:28:17.600016   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:28:17.764] error: you cannot rollback a paused deployment; resume it first with 'kubectl rollout resume deployment/nginx' and try again
I0315 08:28:17.915] deployment.extensions/nginx resumed
I0315 08:28:18.127] deployment.extensions/nginx rolled back
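A paused deployment records spec changes but blocks rollouts, so undo refuses to act until it is resumed; the log above shows exactly that pause/fail/resume/undo sequence. As a sketch:

    kubectl rollout pause deployment/nginx
    kubectl rollout undo deployment/nginx     # refused: resume it first
    kubectl rollout resume deployment/nginx
    kubectl rollout undo deployment/nginx     # now the rollback proceeds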
I0315 08:28:18.411]     deployment.kubernetes.io/revision-history: 1,3
W0315 08:28:18.601] I0315 08:28:18.600431   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:28:18.601] I0315 08:28:18.600863   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:28:18.642] error: desired revision (3) is different from the running revision (5)
I0315 08:28:18.886] deployment.apps/nginx2 created
W0315 08:28:18.987] I0315 08:28:18.893348   58979 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552638489-4110", Name:"nginx2", UID:"4968b055-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1918", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx2-78cb9c866 to 3
W0315 08:28:18.988] I0315 08:28:18.898803   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638489-4110", Name:"nginx2-78cb9c866", UID:"4969dae7-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1919", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-78cb9c866-nvhr2
W0315 08:28:18.988] I0315 08:28:18.905209   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638489-4110", Name:"nginx2-78cb9c866", UID:"4969dae7-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1919", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-78cb9c866-lnx65
W0315 08:28:18.989] I0315 08:28:18.905567   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638489-4110", Name:"nginx2-78cb9c866", UID:"4969dae7-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1919", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-78cb9c866-gls68
I0315 08:28:19.089] deployment.extensions "nginx2" deleted
... skipping 12 lines ...
I0315 08:28:20.057] deployment.extensions/nginx-deployment image updated
W0315 08:28:20.158] I0315 08:28:20.063650   58979 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552638489-4110", Name:"nginx-deployment", UID:"49c5252a-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1965", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-5bfd55c857 to 1
W0315 08:28:20.158] I0315 08:28:20.071091   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638489-4110", Name:"nginx-deployment-5bfd55c857", UID:"4a1c5fd7-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1966", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-5bfd55c857-mhnkt
I0315 08:28:20.259] apps.sh:337: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
I0315 08:28:20.336] apps.sh:338: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
I0315 08:28:20.617] deployment.extensions/nginx-deployment image updated
W0315 08:28:20.717] error: unable to find container named "redis"
W0315 08:28:20.718] I0315 08:28:20.601739   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:28:20.718] I0315 08:28:20.602027   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 08:28:20.819] apps.sh:343: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0315 08:28:20.889] apps.sh:344: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
I0315 08:28:21.027] deployment.apps/nginx-deployment image updated
W0315 08:28:21.155] I0315 08:28:21.154848   58979 horizontal.go:320] Horizontal Pod Autoscaler frontend has been deleted in namespace-1552638476-5734
... skipping 75 lines ...
I0315 08:28:25.202] replicaset.apps/frontend created
I0315 08:28:25.214] +++ [0315 08:28:25] Deleting rs
I0315 08:28:25.293] replicaset.extensions "frontend" deleted
I0315 08:28:25.391] apps.sh:508: Successful get pods -l "tier=frontend" {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 08:28:25.480] apps.sh:512: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 08:28:25.637] replicaset.apps/frontend-no-cascade created
W0315 08:28:25.745] E0315 08:28:24.472670   58979 replica_set.go:450] Sync "namespace-1552638489-4110/nginx-deployment-54979c5b5c" failed with Operation cannot be fulfilled on replicasets.apps "nginx-deployment-54979c5b5c": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1552638489-4110/nginx-deployment-54979c5b5c, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 4c33523b-46fc-11e9-af77-0242ac110002, UID in object meta: 
W0315 08:28:25.750] I0315 08:28:24.476177   58979 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552638489-4110", Name:"nginx-deployment", UID:"4b92ad5d-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2120", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-5cc58864fb to 1
W0315 08:28:25.750] I0315 08:28:24.522492   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638489-4110", Name:"nginx-deployment-58dbcd7c7f", UID:"4c01d651-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2118", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-58dbcd7c7f-vlp6h
W0315 08:28:25.750] I0315 08:28:24.603739   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:28:25.750] I0315 08:28:24.604108   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:28:25.751] I0315 08:28:24.627847   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638489-4110", Name:"nginx-deployment-5cc58864fb", UID:"4cbcdace-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2133", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-5cc58864fb-gxmml
W0315 08:28:25.751] E0315 08:28:24.772774   58979 replica_set.go:450] Sync "namespace-1552638489-4110/nginx-deployment-5b4bdf69f4" failed with replicasets.apps "nginx-deployment-5b4bdf69f4" not found
W0315 08:28:25.751] E0315 08:28:24.922546   58979 replica_set.go:450] Sync "namespace-1552638489-4110/nginx-deployment-58dbcd7c7f" failed with replicasets.apps "nginx-deployment-58dbcd7c7f" not found
W0315 08:28:25.751] E0315 08:28:24.972334   58979 replica_set.go:450] Sync "namespace-1552638489-4110/nginx-deployment-5cc58864fb" failed with replicasets.apps "nginx-deployment-5cc58864fb" not found
W0315 08:28:25.751] I0315 08:28:25.210540   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638504-32485", Name:"frontend", UID:"4d2cb66d-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2153", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-cv9j9
W0315 08:28:25.752] I0315 08:28:25.214882   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638504-32485", Name:"frontend", UID:"4d2cb66d-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2153", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-qfrg5
W0315 08:28:25.752] I0315 08:28:25.216090   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638504-32485", Name:"frontend", UID:"4d2cb66d-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2153", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-9kk9k
W0315 08:28:25.752] E0315 08:28:25.422275   58979 replica_set.go:450] Sync "namespace-1552638504-32485/frontend" failed with replicasets.apps "frontend" not found
W0315 08:28:25.752] I0315 08:28:25.604409   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:28:25.753] I0315 08:28:25.604696   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:28:25.753] I0315 08:28:25.643091   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638504-32485", Name:"frontend-no-cascade", UID:"4d6f0bdd-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2168", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-rh4sr
W0315 08:28:25.753] I0315 08:28:25.648259   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638504-32485", Name:"frontend-no-cascade", UID:"4d6f0bdd-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2168", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-xcm4x
W0315 08:28:25.754] I0315 08:28:25.649163   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638504-32485", Name:"frontend-no-cascade", UID:"4d6f0bdd-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2168", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-kmmvk
I0315 08:28:25.854] apps.sh:518: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
I0315 08:28:25.855] +++ [0315 08:28:25] Deleting rs
I0315 08:28:25.855] replicaset.extensions "frontend-no-cascade" deleted
W0315 08:28:25.955] E0315 08:28:25.871765   58979 replica_set.go:450] Sync "namespace-1552638504-32485/frontend-no-cascade" failed with replicasets.apps "frontend-no-cascade" not found
I0315 08:28:26.056] apps.sh:522: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 08:28:26.058] apps.sh:524: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
I0315 08:28:26.143] (Bpod "frontend-no-cascade-kmmvk" deleted
I0315 08:28:26.150] pod "frontend-no-cascade-rh4sr" deleted
I0315 08:28:26.156] pod "frontend-no-cascade-xcm4x" deleted
I0315 08:28:26.254] apps.sh:527: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
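The non-cascading delete above removes only the ReplicaSet object and orphans its pods, which is why apps.sh:524 still finds all three php-redis pods after the rs is gone and they have to be deleted individually. A sketch (--cascade=false is the spelling of this kubectl era):

    kubectl delete rs frontend-no-cascade --cascade=false   # deletes the ReplicaSet, leaves its pods running
    kubectl get pods -l tier=frontend                       # the three orphaned pods are still listed
    kubectl delete pods -l tier=frontend                    # explicit cleanup, as the log does above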
... skipping 5 lines ...
I0315 08:28:26.733] Namespace:    namespace-1552638504-32485
I0315 08:28:26.733] Selector:     app=guestbook,tier=frontend
I0315 08:28:26.733] Labels:       app=guestbook
I0315 08:28:26.733]               tier=frontend
I0315 08:28:26.734] Annotations:  <none>
I0315 08:28:26.734] Replicas:     3 current / 3 desired
I0315 08:28:26.734] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 08:28:26.734] Pod Template:
I0315 08:28:26.734]   Labels:  app=guestbook
I0315 08:28:26.734]            tier=frontend
I0315 08:28:26.734]   Containers:
I0315 08:28:26.734]    php-redis:
I0315 08:28:26.735]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
I0315 08:28:26.850] Namespace:    namespace-1552638504-32485
I0315 08:28:26.850] Selector:     app=guestbook,tier=frontend
I0315 08:28:26.850] Labels:       app=guestbook
I0315 08:28:26.850]               tier=frontend
I0315 08:28:26.850] Annotations:  <none>
I0315 08:28:26.851] Replicas:     3 current / 3 desired
I0315 08:28:26.851] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 08:28:26.851] Pod Template:
I0315 08:28:26.851]   Labels:  app=guestbook
I0315 08:28:26.851]            tier=frontend
I0315 08:28:26.851]   Containers:
I0315 08:28:26.851]    php-redis:
I0315 08:28:26.851]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 23 lines ...
I0315 08:28:27.056] Namespace:    namespace-1552638504-32485
I0315 08:28:27.056] Selector:     app=guestbook,tier=frontend
I0315 08:28:27.056] Labels:       app=guestbook
I0315 08:28:27.056]               tier=frontend
I0315 08:28:27.056] Annotations:  <none>
I0315 08:28:27.057] Replicas:     3 current / 3 desired
I0315 08:28:27.057] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 08:28:27.057] Pod Template:
I0315 08:28:27.057]   Labels:  app=guestbook
I0315 08:28:27.057]            tier=frontend
I0315 08:28:27.057]   Containers:
I0315 08:28:27.057]    php-redis:
I0315 08:28:27.057]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 12 lines ...
I0315 08:28:27.092] Namespace:    namespace-1552638504-32485
I0315 08:28:27.092] Selector:     app=guestbook,tier=frontend
I0315 08:28:27.092] Labels:       app=guestbook
I0315 08:28:27.092]               tier=frontend
I0315 08:28:27.092] Annotations:  <none>
I0315 08:28:27.093] Replicas:     3 current / 3 desired
I0315 08:28:27.093] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 08:28:27.093] Pod Template:
I0315 08:28:27.093]   Labels:  app=guestbook
I0315 08:28:27.093]            tier=frontend
I0315 08:28:27.094]   Containers:
I0315 08:28:27.094]    php-redis:
I0315 08:28:27.094]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 18 lines ...
I0315 08:28:27.240] Namespace:    namespace-1552638504-32485
I0315 08:28:27.240] Selector:     app=guestbook,tier=frontend
I0315 08:28:27.240] Labels:       app=guestbook
I0315 08:28:27.241]               tier=frontend
I0315 08:28:27.241] Annotations:  <none>
I0315 08:28:27.241] Replicas:     3 current / 3 desired
I0315 08:28:27.241] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 08:28:27.241] Pod Template:
I0315 08:28:27.241]   Labels:  app=guestbook
I0315 08:28:27.241]            tier=frontend
I0315 08:28:27.242]   Containers:
I0315 08:28:27.242]    php-redis:
I0315 08:28:27.242]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
I0315 08:28:27.352] Namespace:    namespace-1552638504-32485
I0315 08:28:27.352] Selector:     app=guestbook,tier=frontend
I0315 08:28:27.352] Labels:       app=guestbook
I0315 08:28:27.352]               tier=frontend
I0315 08:28:27.352] Annotations:  <none>
I0315 08:28:27.352] Replicas:     3 current / 3 desired
I0315 08:28:27.352] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 08:28:27.353] Pod Template:
I0315 08:28:27.353]   Labels:  app=guestbook
I0315 08:28:27.353]            tier=frontend
I0315 08:28:27.353]   Containers:
I0315 08:28:27.353]    php-redis:
I0315 08:28:27.353]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
I0315 08:28:27.461] Namespace:    namespace-1552638504-32485
I0315 08:28:27.461] Selector:     app=guestbook,tier=frontend
I0315 08:28:27.461] Labels:       app=guestbook
I0315 08:28:27.461]               tier=frontend
I0315 08:28:27.461] Annotations:  <none>
I0315 08:28:27.461] Replicas:     3 current / 3 desired
I0315 08:28:27.461] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 08:28:27.462] Pod Template:
I0315 08:28:27.462]   Labels:  app=guestbook
I0315 08:28:27.462]            tier=frontend
I0315 08:28:27.462]   Containers:
I0315 08:28:27.462]    php-redis:
I0315 08:28:27.462]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 11 lines ...
I0315 08:28:27.586] Namespace:    namespace-1552638504-32485
I0315 08:28:27.586] Selector:     app=guestbook,tier=frontend
I0315 08:28:27.587] Labels:       app=guestbook
I0315 08:28:27.587]               tier=frontend
I0315 08:28:27.587] Annotations:  <none>
I0315 08:28:27.587] Replicas:     3 current / 3 desired
I0315 08:28:27.588] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 08:28:27.588] Pod Template:
I0315 08:28:27.588]   Labels:  app=guestbook
I0315 08:28:27.588]            tier=frontend
I0315 08:28:27.588]   Containers:
I0315 08:28:27.589]    php-redis:
I0315 08:28:27.589]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 196 lines ...
I0315 08:28:33.154] horizontalpodautoscaler.autoscaling "frontend" deleted
W0315 08:28:33.255] I0315 08:28:32.563348   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638504-32485", Name:"frontend", UID:"518efdb6-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2379", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-xphfn
W0315 08:28:33.255] I0315 08:28:32.567608   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638504-32485", Name:"frontend", UID:"518efdb6-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2379", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-2vj4q
W0315 08:28:33.256] I0315 08:28:32.567943   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552638504-32485", Name:"frontend", UID:"518efdb6-46fc-11e9-af77-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2379", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-nsx4f
W0315 08:28:33.256] I0315 08:28:32.607910   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:28:33.257] I0315 08:28:32.608089   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:28:33.257] Error: required flag(s) "max" not set
W0315 08:28:33.257] 
W0315 08:28:33.257] 
W0315 08:28:33.257] Examples:
W0315 08:28:33.258]   # Auto scale a deployment "foo", with the number of pods between 2 and 10, no target CPU utilization specified so a default autoscaling policy will be used:
W0315 08:28:33.258]   kubectl autoscale deployment foo --min=2 --max=10
W0315 08:28:33.258]   
... skipping 97 lines ...
I0315 08:28:36.539] statefulset.apps/nginx rolled back
W0315 08:28:36.639] I0315 08:28:36.610277   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:28:36.640] I0315 08:28:36.610493   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 08:28:36.740] apps.sh:435: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
I0315 08:28:36.748] apps.sh:436: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0315 08:28:36.857] Successful
I0315 08:28:36.858] message:error: unable to find specified revision 1000000 in history
I0315 08:28:36.858] has:unable to find specified revision
I0315 08:28:36.954] apps.sh:440: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
I0315 08:28:37.047] apps.sh:441: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0315 08:28:37.159] statefulset.apps/nginx rolled back
I0315 08:28:37.259] apps.sh:444: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
I0315 08:28:37.365] apps.sh:445: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
... skipping 62 lines ...
I0315 08:28:39.256] Name:         mock
I0315 08:28:39.257] Namespace:    namespace-1552638518-2412
I0315 08:28:39.257] Selector:     app=mock
I0315 08:28:39.257] Labels:       app=mock
I0315 08:28:39.257] Annotations:  <none>
I0315 08:28:39.257] Replicas:     1 current / 1 desired
I0315 08:28:39.258] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0315 08:28:39.258] Pod Template:
I0315 08:28:39.258]   Labels:  app=mock
I0315 08:28:39.258]   Containers:
I0315 08:28:39.258]    mock-container:
I0315 08:28:39.259]     Image:        k8s.gcr.io/pause:2.0
I0315 08:28:39.259]     Port:         9949/TCP
... skipping 60 lines ...
I0315 08:28:41.551] Name:         mock
I0315 08:28:41.551] Namespace:    namespace-1552638518-2412
I0315 08:28:41.551] Selector:     app=mock
I0315 08:28:41.551] Labels:       app=mock
I0315 08:28:41.551] Annotations:  <none>
I0315 08:28:41.551] Replicas:     1 current / 1 desired
I0315 08:28:41.551] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0315 08:28:41.552] Pod Template:
I0315 08:28:41.552]   Labels:  app=mock
I0315 08:28:41.552]   Containers:
I0315 08:28:41.552]    mock-container:
I0315 08:28:41.552]     Image:        k8s.gcr.io/pause:2.0
I0315 08:28:41.552]     Port:         9949/TCP
... skipping 62 lines ...
I0315 08:28:43.790] Name:         mock
I0315 08:28:43.790] Namespace:    namespace-1552638518-2412
I0315 08:28:43.790] Selector:     app=mock
I0315 08:28:43.791] Labels:       app=mock
I0315 08:28:43.791] Annotations:  <none>
I0315 08:28:43.791] Replicas:     1 current / 1 desired
I0315 08:28:43.791] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0315 08:28:43.791] Pod Template:
I0315 08:28:43.792]   Labels:  app=mock
I0315 08:28:43.792]   Containers:
I0315 08:28:43.792]    mock-container:
I0315 08:28:43.792]     Image:        k8s.gcr.io/pause:2.0
I0315 08:28:43.792]     Port:         9949/TCP
... skipping 46 lines ...
I0315 08:28:45.928] Namespace:    namespace-1552638518-2412
I0315 08:28:45.928] Selector:     app=mock
I0315 08:28:45.928] Labels:       app=mock
I0315 08:28:45.928]               status=replaced
I0315 08:28:45.928] Annotations:  <none>
I0315 08:28:45.928] Replicas:     1 current / 1 desired
I0315 08:28:45.928] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0315 08:28:45.929] Pod Template:
I0315 08:28:45.929]   Labels:  app=mock
I0315 08:28:45.929]   Containers:
I0315 08:28:45.929]    mock-container:
I0315 08:28:45.929]     Image:        k8s.gcr.io/pause:2.0
I0315 08:28:45.929]     Port:         9949/TCP
... skipping 11 lines ...
I0315 08:28:45.931] Namespace:    namespace-1552638518-2412
I0315 08:28:45.931] Selector:     app=mock2
I0315 08:28:45.931] Labels:       app=mock2
I0315 08:28:45.932]               status=replaced
I0315 08:28:45.932] Annotations:  <none>
I0315 08:28:45.932] Replicas:     1 current / 1 desired
I0315 08:28:45.932] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0315 08:28:45.932] Pod Template:
I0315 08:28:45.933]   Labels:  app=mock2
I0315 08:28:45.933]   Containers:
I0315 08:28:45.933]    mock-container:
I0315 08:28:45.933]     Image:        k8s.gcr.io/pause:2.0
I0315 08:28:45.933]     Port:         9949/TCP
... skipping 117 lines ...
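The Name/Namespace/Selector blocks above are kubectl describe output for the mock and mock2 ReplicationControllers. A sketch of the command that produces this shape of output, using the namespace from the log:

  kubectl describe rc mock mock2 --namespace=namespace-1552638518-2412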
W0315 08:28:51.466] I0315 08:28:48.617001   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:28:51.466] I0315 08:28:49.617225   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:28:51.467] I0315 08:28:49.617438   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:28:51.467] I0315 08:28:50.025227   58979 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552638518-2412", Name:"mock", UID:"5bf7c9f1-46fc-11e9-af77-0242ac110002", APIVersion:"v1", ResourceVersion:"2647", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-p9pw6
W0315 08:28:51.467] I0315 08:28:50.617701   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:28:51.468] I0315 08:28:50.618072   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:28:51.538] E0315 08:28:51.537730   58979 pv_protection_controller.go:116] PV pv0002 failed with : Operation cannot be fulfilled on persistentvolumes "pv0002": the object has been modified; please apply your changes to the latest version and try again
W0315 08:28:51.619] I0315 08:28:51.618288   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:28:51.619] I0315 08:28:51.618460   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 08:28:51.720] persistentvolume/pv0002 created
I0315 08:28:51.720] storage.sh:36: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0002:
I0315 08:28:51.720] persistentvolume "pv0002" deleted
I0315 08:28:51.910] persistentvolume/pv0003 created
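storage.sh creates bare PersistentVolumes, lists them by name with a go-template, and deletes them; the pv_protection_controller conflict logged above is the usual optimistic-concurrency retry while the protection finalizer is updated, not a test failure. A minimal sketch, assuming a hostPath manifest similar to the (unshown) test fixture:

  cat <<'EOF' | kubectl create -f -
  apiVersion: v1
  kind: PersistentVolume
  metadata:
    name: pv0002
  spec:
    capacity:
      storage: 1Gi
    accessModes: ["ReadWriteOnce"]
    hostPath:
      path: /tmp/pv0002
  EOF
  kubectl get pv -o go-template='{{range .items}}{{.metadata.name}}:{{end}}'   # pv0002:
  kubectl delete pv pv0002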
... skipping 500 lines ...
I0315 08:28:56.760] yes
I0315 08:28:56.761] has:the server doesn't have a resource type
I0315 08:28:56.839] Successful
I0315 08:28:56.840] message:yes
I0315 08:28:56.840] has:yes
I0315 08:28:56.916] Successful
I0315 08:28:56.917] message:error: --subresource can not be used with NonResourceURL
I0315 08:28:56.917] has:subresource can not be used with NonResourceURL
I0315 08:28:56.998] Successful
I0315 08:28:57.082] Successful
I0315 08:28:57.083] message:yes
I0315 08:28:57.083] 0
I0315 08:28:57.083] has:0
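These messages come from kubectl auth can-i, which rejects --subresource for non-resource URLs and supports a quiet mode that reports only via exit code (the trailing "0" above). A sketch of representative invocations, hedged since the exact harness arguments are not shown in the log:

  kubectl auth can-i get pods                      # "yes"
  kubectl auth can-i get /logs --subresource=log   # error: --subresource can not be used with NonResourceURL
  kubectl auth can-i get pods --quiet; echo $?     # no output; "0" means allowed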
... skipping 6 lines ...
I0315 08:28:57.271] role.rbac.authorization.k8s.io/testing-R reconciled
I0315 08:28:57.366] legacy-script.sh:769: Successful get rolebindings -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-RB:
I0315 08:28:57.461] legacy-script.sh:770: Successful get roles -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-R:
I0315 08:28:57.556] legacy-script.sh:771: Successful get clusterrolebindings -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CRB:
I0315 08:28:57.656] legacy-script.sh:772: Successful get clusterroles -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CR:
I0315 08:28:57.739] Successful
I0315 08:28:57.740] message:error: only rbac.authorization.k8s.io/v1 is supported: not *v1beta1.ClusterRole
I0315 08:28:57.740] has:only rbac.authorization.k8s.io/v1 is supported
I0315 08:28:57.831] rolebinding.rbac.authorization.k8s.io "testing-RB" deleted
I0315 08:28:57.837] role.rbac.authorization.k8s.io "testing-R" deleted
I0315 08:28:57.855] clusterrole.rbac.authorization.k8s.io "testing-CR" deleted
I0315 08:28:57.868] clusterrolebinding.rbac.authorization.k8s.io "testing-CRB" deleted
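The "reconciled" lines come from kubectl auth reconcile, which creates or updates RBAC objects to match a manifest and accepts only rbac.authorization.k8s.io/v1 input (hence the *v1beta1.ClusterRole error above). A sketch, assuming fixture files like the harness uses:

  kubectl auth reconcile -f rbac-v1.yaml        # roles/bindings reconciled, as logged above
  kubectl auth reconcile -f rbac-v1beta1.yaml   # error: only rbac.authorization.k8s.io/v1 is supported
  kubectl delete clusterrole/testing-CR clusterrolebinding/testing-CRB   # cleanup by name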
I0315 08:28:57.879] Recording: run_retrieve_multiple_tests
... skipping 1211 lines ...
I0315 08:29:25.702] message:node/127.0.0.1 already uncordoned (dry run)
I0315 08:29:25.703] has:already uncordoned
I0315 08:29:25.794] node-management.sh:119: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
I0315 08:29:25.901] node/127.0.0.1 labeled
I0315 08:29:25.996] node-management.sh:124: Successful get nodes 127.0.0.1 {{.metadata.labels.test}}: label
I0315 08:29:26.067] Successful
I0315 08:29:26.068] message:error: cannot specify both a node name and a --selector option
I0315 08:29:26.068] See 'kubectl drain -h' for help and examples
I0315 08:29:26.068] has:cannot specify both a node name
I0315 08:29:26.136] Successful
I0315 08:29:26.136] message:error: USAGE: cordon NODE [flags]
I0315 08:29:26.136] See 'kubectl cordon -h' for help and examples
I0315 08:29:26.137] has:error\: USAGE\: cordon NODE
I0315 08:29:26.213] node/127.0.0.1 already uncordoned
I0315 08:29:26.288] Successful
I0315 08:29:26.288] message:error: You must provide one or more resources by argument or filename.
I0315 08:29:26.288] Example resource specifications include:
I0315 08:29:26.289]    '-f rsrc.yaml'
I0315 08:29:26.289]    '--filename=rsrc.json'
I0315 08:29:26.289]    '<resource> <name>'
I0315 08:29:26.289]    '<resource>'
I0315 08:29:26.289] has:must provide one or more resources
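node-management.sh exercises cordon/uncordon/drain argument validation against the test's 127.0.0.1 node. A sketch of the kinds of calls behind the messages above (the no-argument case is an assumption inferred from the "must provide one or more resources" error):

  kubectl uncordon 127.0.0.1 --dry-run     # "already uncordoned (dry run)"
  kubectl drain 127.0.0.1 --selector=a=b   # error: cannot specify both a node name and a --selector option
  kubectl cordon                           # error: USAGE: cordon NODE [flags]
  kubectl uncordon                         # error: You must provide one or more resources...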
... skipping 15 lines ...
I0315 08:29:26.725] Successful
I0315 08:29:26.725] message:The following compatible plugins are available:
I0315 08:29:26.725] 
I0315 08:29:26.725] test/fixtures/pkg/kubectl/plugins/version/kubectl-version
I0315 08:29:26.725]   - warning: kubectl-version overwrites existing command: "kubectl version"
I0315 08:29:26.725] 
I0315 08:29:26.726] error: one plugin warning was found
I0315 08:29:26.726] has:kubectl-version overwrites existing command: "kubectl version"
I0315 08:29:26.801] Successful
I0315 08:29:26.801] message:The following compatible plugins are available:
I0315 08:29:26.801] 
I0315 08:29:26.801] test/fixtures/pkg/kubectl/plugins/kubectl-foo
I0315 08:29:26.802] test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo
I0315 08:29:26.802]   - warning: test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin: test/fixtures/pkg/kubectl/plugins/kubectl-foo
I0315 08:29:26.802] 
I0315 08:29:26.802] error: one plugin warning was found
I0315 08:29:26.802] has:test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin
I0315 08:29:26.878] Successful
I0315 08:29:26.878] message:The following compatible plugins are available:
I0315 08:29:26.878] 
I0315 08:29:26.879] test/fixtures/pkg/kubectl/plugins/kubectl-foo
I0315 08:29:26.879] has:plugins are available
I0315 08:29:26.953] Successful
I0315 08:29:26.953] message:
I0315 08:29:26.954] error: unable to find any kubectl plugins in your PATH
I0315 08:29:26.954] has:unable to find any kubectl plugins in your PATH
I0315 08:29:27.025] Successful
I0315 08:29:27.026] message:I am plugin foo
I0315 08:29:27.026] has:plugin foo
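kubectl discovers plugins as executables named kubectl-* on PATH; kubectl plugin list warns when a plugin would overwrite a builtin command or is shadowed by an earlier PATH entry, and "kubectl foo" dispatches to kubectl-foo. A minimal sketch of a plugin like the kubectl-foo fixture:

  cat > /usr/local/bin/kubectl-foo <<'EOF'
  #!/bin/bash
  echo "I am plugin foo"
  EOF
  chmod +x /usr/local/bin/kubectl-foo
  kubectl plugin list   # lists compatible plugins, with shadow/overwrite warnings
  kubectl foo           # prints "I am plugin foo"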
I0315 08:29:27.098] Successful
I0315 08:29:27.099] message:Client Version: version.Info{Major:"1", Minor:"15+", GitVersion:"v1.15.0-alpha.0.1222+d5a3db003916b1", GitCommit:"d5a3db003916b1d33b503ccd2e4897e094d8af77", GitTreeState:"clean", BuildDate:"2019-03-15T08:22:37Z", GoVersion:"go1.11.5", Compiler:"gc", Platform:"linux/amd64"}
... skipping 9 lines ...
I0315 08:29:27.174] 
I0315 08:29:27.176] +++ Running case: test-cmd.run_impersonation_tests 
I0315 08:29:27.178] +++ working dir: /go/src/k8s.io/kubernetes
I0315 08:29:27.181] +++ command: run_impersonation_tests
I0315 08:29:27.190] +++ [0315 08:29:27] Testing impersonation
I0315 08:29:27.261] Successful
I0315 08:29:27.261] message:error: requesting groups or user-extra for  without impersonating a user
I0315 08:29:27.261] has:without impersonating a user
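The impersonation check verifies that group impersonation requires a user: --as-group (and user-extra) are rejected unless --as names whom to impersonate. A sketch:

  kubectl get pods --as-group=system:masters            # error: requesting groups or user-extra for  without impersonating a user
  kubectl get pods --as=bob --as-group=system:masters   # valid: impersonate user bob in that group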
W0315 08:29:27.362] I0315 08:29:24.636475   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:29:27.362] I0315 08:29:24.636752   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:29:27.363] I0315 08:29:25.637021   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 08:29:27.363] I0315 08:29:25.637299   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 08:29:27.363] I0315 08:29:26.637534   55692 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
... skipping 88 lines ...
W0315 08:29:30.989] I0315 08:29:30.977965   55692 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 08:29:30.990] I0315 08:29:30.978012   55692 secure_serving.go:160] Stopped listening on 127.0.0.1:6443
W0315 08:29:30.990] W0315 08:29:30.978189   55692 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 163 lines ...
W0315 08:29:31.021] I0315 08:29:30.984974   55692 picker_wrapper.go:218] blockingPicker: the picked transport is not ready, loop back to repick
W0315 08:29:31.021] E0315 08:29:30.984981   55692 controller.go:179] rpc error: code = Unavailable desc = transport is closing
... skipping 79 lines ...
I0315 08:42:29.790] ok  	k8s.io/kubernetes/test/integration/serving	53.725s
I0315 08:42:29.791] ok  	k8s.io/kubernetes/test/integration/statefulset	12.497s
I0315 08:42:29.791] ok  	k8s.io/kubernetes/test/integration/storageclasses	5.042s
I0315 08:42:29.791] ok  	k8s.io/kubernetes/test/integration/tls	8.922s
I0315 08:42:29.791] ok  	k8s.io/kubernetes/test/integration/ttlcontroller	11.358s
I0315 08:42:29.791] ok  	k8s.io/kubernetes/test/integration/volume	95.089s
I0315 08:42:29.791] FAIL	k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/test/integration	93.214s
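The apiextensions-apiserver integration package is the only failure in the run. A hedged sketch of rerunning just that package locally; WHAT and KUBE_TEST_ARGS are the knobs hack/make-rules/test-integration.sh reads, but verify them against your tree (add KUBE_TEST_ARGS="-run <TestName>" to narrow to a single test):

  make test-integration WHAT=./vendor/k8s.io/apiextensions-apiserver/test/integration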
I0315 08:42:45.474] +++ [0315 08:42:45] Saved JUnit XML test report to /workspace/artifacts/junit_d431ed5f68ae4ddf888439fb96b687a923412204_20190315-082941.xml
I0315 08:42:45.476] Makefile:184: recipe for target 'test' failed
I0315 08:42:45.486] +++ [0315 08:42:45] Cleaning up etcd
W0315 08:42:45.587] make[1]: *** [test] Error 1
W0315 08:42:45.587] !!! [0315 08:42:45] Call tree:
W0315 08:42:45.587] !!! [0315 08:42:45]  1: hack/make-rules/test-integration.sh:99 runTests(...)
I0315 08:42:45.765] +++ [0315 08:42:45] Integration test cleanup complete
I0315 08:42:45.766] Makefile:203: recipe for target 'test-integration' failed
W0315 08:42:45.866] make: *** [test-integration] Error 1
W0315 08:42:48.565] Traceback (most recent call last):
W0315 08:42:48.566]   File "/workspace/./test-infra/jenkins/../scenarios/kubernetes_verify.py", line 178, in <module>
W0315 08:42:48.566]     ARGS.exclude_typecheck, ARGS.exclude_godep)
W0315 08:42:48.566]   File "/workspace/./test-infra/jenkins/../scenarios/kubernetes_verify.py", line 140, in main
W0315 08:42:48.566]     check(*cmd)
W0315 08:42:48.566]   File "/workspace/./test-infra/jenkins/../scenarios/kubernetes_verify.py", line 48, in check
W0315 08:42:48.566]     subprocess.check_call(cmd)
W0315 08:42:48.566]   File "/usr/lib/python2.7/subprocess.py", line 186, in check_call
W0315 08:42:48.596]     raise CalledProcessError(retcode, cmd)
W0315 08:42:48.597] subprocess.CalledProcessError: Command '('docker', 'run', '--rm=true', '--privileged=true', '-v', '/var/run/docker.sock:/var/run/docker.sock', '-v', '/etc/localtime:/etc/localtime:ro', '-v', '/workspace/k8s.io/kubernetes:/go/src/k8s.io/kubernetes', '-v', '/workspace/k8s.io/:/workspace/k8s.io/', '-v', '/workspace/_artifacts:/workspace/artifacts', '-e', 'KUBE_FORCE_VERIFY_CHECKS=y', '-e', 'KUBE_VERIFY_GIT_BRANCH=master', '-e', 'EXCLUDE_TYPECHECK=n', '-e', 'EXCLUDE_GODEP=n', '-e', 'REPO_DIR=/workspace/k8s.io/kubernetes', '--tmpfs', '/tmp:exec,mode=1777', 'gcr.io/k8s-testimages/kubekins-test:1.13-v20190125-cc5d6ecff3', 'bash', '-c', 'cd kubernetes && ./hack/jenkins/test-dockerized.sh')' returned non-zero exit status 2
E0315 08:42:48.602] Command failed
I0315 08:42:48.602] process 488 exited with code 1 after 29.3m
E0315 08:42:48.602] FAIL: ci-kubernetes-integration-master
I0315 08:42:48.603] Call:  gcloud auth activate-service-account --key-file=/etc/service-account/service-account.json
W0315 08:42:49.107] Activated service account credentials for: [pr-kubekins@kubernetes-jenkins-pull.iam.gserviceaccount.com]
I0315 08:42:49.143] process 127831 exited with code 0 after 0.0m
I0315 08:42:49.143] Call:  gcloud config get-value account
I0315 08:42:49.444] process 127843 exited with code 0 after 0.0m
I0315 08:42:49.445] Will upload results to gs://kubernetes-jenkins/logs using pr-kubekins@kubernetes-jenkins-pull.iam.gserviceaccount.com
I0315 08:42:49.445] Upload result and artifacts...
I0315 08:42:49.445] Gubernator results at https://gubernator.k8s.io/build/kubernetes-jenkins/logs/ci-kubernetes-integration-master/9492
I0315 08:42:49.445] Call:  gsutil ls gs://kubernetes-jenkins/logs/ci-kubernetes-integration-master/9492/artifacts
W0315 08:42:50.539] CommandException: One or more URLs matched no objects.
E0315 08:42:50.658] Command failed
I0315 08:42:50.658] process 127855 exited with code 1 after 0.0m
W0315 08:42:50.658] Remote dir gs://kubernetes-jenkins/logs/ci-kubernetes-integration-master/9492/artifacts not exist yet
I0315 08:42:50.659] Call:  gsutil -m -q -o GSUtil:use_magicfile=True cp -r -c -z log,txt,xml /workspace/_artifacts gs://kubernetes-jenkins/logs/ci-kubernetes-integration-master/9492/artifacts
I0315 08:42:55.053] process 127997 exited with code 0 after 0.1m
W0315 08:42:55.054] metadata path /workspace/_artifacts/metadata.json does not exist
W0315 08:42:55.054] metadata not found or invalid, init with empty metadata
... skipping 15 lines ...