PR: neolit123: Revert "Speculative workaround for #74890"
Result: FAILURE
Tests: 1 failed / 648 succeeded / 4 skipped
Started: 2019-03-15 02:04
Elapsed: 28m28s
Refs: master:d5a3db00, 75393:4eabc1cc
Builder: gke-prow-containerd-pool-99179761-521q
pod: 8dfe9172-46c6-11e9-b821-0a580a6c1069
infra-commit: aaeae12bb
repo: k8s.io/kubernetes
repo-commit: a2313b874e222b2a33fe704fa54470dc0a94ec2a
repos: {u'k8s.io/kubernetes': u'master:d5a3db003916b1d33b503ccd2e4897e094d8af77,75393:4eabc1cc16118255ad8c8274d61e1e7a99f940f7'}

Test Failures


k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/test/integration TestStatusSubresource 2.73s

go test -v k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/test/integration -run TestStatusSubresource$
I0315 02:27:30.639509  126448 secure_serving.go:160] Stopped listening on 127.0.0.1:38543
I0315 02:27:30.640318  126448 serving.go:312] Generated self-signed cert (/tmp/apiextensions-apiserver615950570/apiserver.crt, /tmp/apiextensions-apiserver615950570/apiserver.key)
W0315 02:27:31.260070  126448 mutation_detector.go:48] Mutation detector is enabled, this will result in memory leakage.
I0315 02:27:31.261965  126448 clientconn.go:551] parsed scheme: ""
I0315 02:27:31.261997  126448 clientconn.go:557] scheme "" not registered, fallback to default scheme
I0315 02:27:31.262057  126448 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0315 02:27:31.262112  126448 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0315 02:27:31.262543  126448 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0315 02:27:31.262552  126448 clientconn.go:551] parsed scheme: ""
I0315 02:27:31.262581  126448 clientconn.go:557] scheme "" not registered, fallback to default scheme
I0315 02:27:31.262635  126448 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0315 02:27:31.262675  126448 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0315 02:27:31.263010  126448 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 02:27:31.264432  126448 mutation_detector.go:48] Mutation detector is enabled, this will result in memory leakage.
I0315 02:27:31.266171  126448 secure_serving.go:116] Serving securely on 127.0.0.1:42939
I0315 02:27:31.266518  126448 crd_finalizer.go:242] Starting CRDFinalizer
I0315 02:27:31.266617  126448 customresource_discovery_controller.go:208] Starting DiscoveryController
I0315 02:27:31.266636  126448 naming_controller.go:284] Starting NamingConditionController
I0315 02:27:31.266655  126448 establishing_controller.go:73] Starting EstablishingController
E0315 02:27:31.267343  126448 reflector.go:126] k8s.io/client-go/informers/factory.go:133: Failed to list *v1.Service: Get http://127.1.2.3:12345/api/v1/services?limit=500&resourceVersion=0: dial tcp 127.1.2.3:12345: connect: connection refused
I0315 02:27:31.640786  126448 clientconn.go:551] parsed scheme: ""
I0315 02:27:31.640825  126448 clientconn.go:557] scheme "" not registered, fallback to default scheme
I0315 02:27:31.640881  126448 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0315 02:27:31.640926  126448 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0315 02:27:31.641442  126448 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0315 02:27:32.176555  126448 clientconn.go:551] parsed scheme: ""
I0315 02:27:32.176590  126448 clientconn.go:557] scheme "" not registered, fallback to default scheme
I0315 02:27:32.176622  126448 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0315 02:27:32.176668  126448 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0315 02:27:32.177560  126448 clientconn.go:551] parsed scheme: ""
I0315 02:27:32.177588  126448 clientconn.go:557] scheme "" not registered, fallback to default scheme
I0315 02:27:32.177625  126448 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0315 02:27:32.177693  126448 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0315 02:27:32.177997  126448 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0315 02:27:32.178499  126448 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
E0315 02:27:32.268001  126448 reflector.go:126] k8s.io/client-go/informers/factory.go:133: Failed to list *v1.Service: Get http://127.1.2.3:12345/api/v1/services?limit=500&resourceVersion=0: dial tcp 127.1.2.3:12345: connect: connection refused
E0315 02:27:33.268725  126448 reflector.go:126] k8s.io/client-go/informers/factory.go:133: Failed to list *v1.Service: Get http://127.1.2.3:12345/api/v1/services?limit=500&resourceVersion=0: dial tcp 127.1.2.3:12345: connect: connection refused
I0315 02:27:33.287448  126448 clientconn.go:551] parsed scheme: ""
I0315 02:27:33.287479  126448 clientconn.go:557] scheme "" not registered, fallback to default scheme
I0315 02:27:33.287514  126448 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0315 02:27:33.287579  126448 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0315 02:27:33.288441  126448 clientconn.go:551] parsed scheme: ""
I0315 02:27:33.288462  126448 clientconn.go:557] scheme "" not registered, fallback to default scheme
I0315 02:27:33.288495  126448 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I0315 02:27:33.288565  126448 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0315 02:27:33.288819  126448 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I0315 02:27:33.289460  126448 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
testserver.go:141: runtime-config=map[api/all:true]
testserver.go:142: Starting apiextensions-apiserver on port 42939...
testserver.go:160: Waiting for /healthz to be ok...
subresources_test.go:197: unable to update status: WishIHadChosenNoxu.mygroup.example.com "foo" is invalid: apiVersion: Invalid value: "mygroup.example.com/v1beta1": must be mygroup.example.com/v1
				from junit_d431ed5f68ae4ddf888439fb96b687a923412204_20190315-021929.xml
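
The failing assertion at subresources_test.go:197 is a write to the custom resource's /status subresource that the server rejects because the object is sent as mygroup.example.com/v1beta1 while only mygroup.example.com/v1 is accepted. A minimal sketch of that kind of update through the dynamic client (not the test's actual code: the resource plural, namespace scope, and status payload are assumptions, and the pre-context UpdateStatus signature matches client-go of this era; newer releases take a context.Context first):

package sketch

import (
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/apis/meta/v1/unstructured"
	"k8s.io/apimachinery/pkg/runtime/schema"
	"k8s.io/client-go/dynamic"
)

// updateStatus writes only the /status subresource of the custom resource
// "foo". Sending the object with apiVersion mygroup.example.com/v1beta1
// instead of v1 reproduces the "must be mygroup.example.com/v1" rejection
// seen in the failure above.
func updateStatus(client dynamic.Interface) error {
	gvr := schema.GroupVersionResource{
		Group:    "mygroup.example.com",
		Version:  "v1",
		Resource: "wishihadchosennoxus", // assumed plural, for illustration only
	}
	obj := &unstructured.Unstructured{Object: map[string]interface{}{
		"apiVersion": "mygroup.example.com/v1", // must match the version the CRD serves
		"kind":       "WishIHadChosenNoxu",
		"metadata":   map[string]interface{}{"name": "foo"},
		"status":     map[string]interface{}{"num": int64(10)}, // hypothetical status field
	}}
	_, err := client.Resource(gvr).Namespace("default").UpdateStatus(obj, metav1.UpdateOptions{})
	return err
}
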


Error lines from build-log.txt

... skipping 306 lines ...
W0315 02:13:37.850] I0315 02:13:37.850072   55607 serving.go:312] Generated self-signed cert (/tmp/apiserver.crt, /tmp/apiserver.key)
W0315 02:13:37.851] I0315 02:13:37.850170   55607 server.go:559] external host was not specified, using 172.17.0.2
W0315 02:13:37.851] W0315 02:13:37.850184   55607 authentication.go:415] AnonymousAuth is not allowed with the AlwaysAllow authorizer. Resetting AnonymousAuth to false. You should use a different authorizer
W0315 02:13:37.851] I0315 02:13:37.850520   55607 server.go:146] Version: v1.15.0-alpha.0.1224+a2313b874e222b
W0315 02:13:38.521] I0315 02:13:38.520963   55607 plugins.go:158] Loaded 4 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority.
W0315 02:13:38.522] I0315 02:13:38.521002   55607 plugins.go:161] Loaded 4 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ResourceQuota.
W0315 02:13:38.522] E0315 02:13:38.521725   55607 prometheus.go:138] failed to register depth metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 02:13:38.522] E0315 02:13:38.521772   55607 prometheus.go:150] failed to register adds metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 02:13:38.523] E0315 02:13:38.521825   55607 prometheus.go:162] failed to register latency metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 02:13:38.523] E0315 02:13:38.521917   55607 prometheus.go:174] failed to register work_duration metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 02:13:38.523] E0315 02:13:38.521952   55607 prometheus.go:189] failed to register unfinished_work_seconds metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 02:13:38.523] E0315 02:13:38.521980   55607 prometheus.go:202] failed to register longest_running_processor_microseconds metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 02:13:38.523] I0315 02:13:38.522008   55607 plugins.go:158] Loaded 4 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority.
W0315 02:13:38.524] I0315 02:13:38.522021   55607 plugins.go:161] Loaded 4 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ResourceQuota.
W0315 02:13:38.525] I0315 02:13:38.525088   55607 clientconn.go:551] parsed scheme: ""
W0315 02:13:38.525] I0315 02:13:38.525126   55607 clientconn.go:557] scheme "" not registered, fallback to default scheme
W0315 02:13:38.525] I0315 02:13:38.525204   55607 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W0315 02:13:38.526] I0315 02:13:38.525527   55607 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
... skipping 361 lines ...
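
The E-level prometheus.go lines above ("failed to register ... duplicate metrics collector registration attempted") are client_golang refusing a second collector whose descriptor the registry already holds; here the admission_quota_controller workqueue metrics get registered more than once in the same process. A minimal sketch of that failure mode, assuming a made-up metric name (the error string is the one client_golang returns for prometheus.AlreadyRegisteredError):

package main

import (
	"fmt"

	"github.com/prometheus/client_golang/prometheus"
)

func main() {
	newDepth := func() prometheus.Gauge {
		return prometheus.NewGauge(prometheus.GaugeOpts{
			Name: "admission_quota_controller_depth", // assumed name, for illustration
			Help: "Current depth of the workqueue.",
		})
	}
	// The first registration succeeds.
	if err := prometheus.Register(newDepth()); err != nil {
		fmt.Println("first:", err)
	}
	// A second collector with an identical descriptor is rejected.
	if err := prometheus.Register(newDepth()); err != nil {
		if are, ok := err.(prometheus.AlreadyRegisteredError); ok {
			fmt.Println("second:", are) // duplicate metrics collector registration attempted
		}
	}
}
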
W0315 02:13:39.067] W0315 02:13:39.066152   55607 genericapiserver.go:344] Skipping API storage.k8s.io/v1alpha1 because it has no resources.
W0315 02:13:39.519] I0315 02:13:39.519120   55607 clientconn.go:551] parsed scheme: ""
W0315 02:13:39.520] I0315 02:13:39.519163   55607 clientconn.go:557] scheme "" not registered, fallback to default scheme
W0315 02:13:39.520] I0315 02:13:39.519368   55607 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W0315 02:13:39.520] I0315 02:13:39.519596   55607 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 02:13:39.521] I0315 02:13:39.520377   55607 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 02:13:39.875] E0315 02:13:39.875010   55607 prometheus.go:138] failed to register depth metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 02:13:39.876] E0315 02:13:39.875071   55607 prometheus.go:150] failed to register adds metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 02:13:39.876] E0315 02:13:39.875173   55607 prometheus.go:162] failed to register latency metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 02:13:39.877] E0315 02:13:39.875268   55607 prometheus.go:174] failed to register work_duration metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 02:13:39.877] E0315 02:13:39.875365   55607 prometheus.go:189] failed to register unfinished_work_seconds metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 02:13:39.877] E0315 02:13:39.875426   55607 prometheus.go:202] failed to register longest_running_processor_microseconds metric admission_quota_controller: duplicate metrics collector registration attempted
W0315 02:13:39.877] I0315 02:13:39.875474   55607 plugins.go:158] Loaded 4 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority.
W0315 02:13:39.878] I0315 02:13:39.875489   55607 plugins.go:161] Loaded 4 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,ResourceQuota.
W0315 02:13:39.878] I0315 02:13:39.877978   55607 clientconn.go:551] parsed scheme: ""
W0315 02:13:39.878] I0315 02:13:39.878010   55607 clientconn.go:557] scheme "" not registered, fallback to default scheme
W0315 02:13:39.878] I0315 02:13:39.878061   55607 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W0315 02:13:39.879] I0315 02:13:39.878199   55607 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
... skipping 250 lines ...
W0315 02:14:16.080] I0315 02:14:16.075808   58898 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for podtemplates
W0315 02:14:16.080] I0315 02:14:16.075853   58898 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for rolebindings.rbac.authorization.k8s.io
W0315 02:14:16.080] I0315 02:14:16.075884   58898 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for leases.coordination.k8s.io
W0315 02:14:16.080] I0315 02:14:16.075911   58898 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for limitranges
W0315 02:14:16.080] I0315 02:14:16.075938   58898 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for daemonsets.extensions
W0315 02:14:16.080] I0315 02:14:16.075961   58898 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for replicasets.extensions
W0315 02:14:16.081] E0315 02:14:16.075979   58898 resource_quota_controller.go:171] initial monitor sync has error: couldn't start monitor for resource "extensions/v1beta1, Resource=networkpolicies": unable to monitor quota for resource "extensions/v1beta1, Resource=networkpolicies"
W0315 02:14:16.081] I0315 02:14:16.076005   58898 controllermanager.go:497] Started "resourcequota"
W0315 02:14:16.081] I0315 02:14:16.076039   58898 resource_quota_controller.go:276] Starting resource quota controller
W0315 02:14:16.081] I0315 02:14:16.076059   58898 controller_utils.go:1027] Waiting for caches to sync for resource quota controller
W0315 02:14:16.081] I0315 02:14:16.076213   58898 resource_quota_monitor.go:301] QuotaMonitor running
W0315 02:14:16.081] I0315 02:14:16.076747   58898 controllermanager.go:497] Started "horizontalpodautoscaling"
W0315 02:14:16.081] I0315 02:14:16.077007   58898 horizontal.go:156] Starting HPA controller
... skipping 45 lines ...
W0315 02:14:16.103] I0315 02:14:16.103005   58898 pv_protection_controller.go:81] Starting PV protection controller
W0315 02:14:16.103] I0315 02:14:16.103026   58898 controller_utils.go:1027] Waiting for caches to sync for PV protection controller
W0315 02:14:16.103] I0315 02:14:16.102940   58898 controller_utils.go:1027] Waiting for caches to sync for namespace controller
W0315 02:14:16.104] I0315 02:14:16.104112   58898 controllermanager.go:497] Started "disruption"
W0315 02:14:16.104] I0315 02:14:16.104236   58898 disruption.go:286] Starting disruption controller
W0315 02:14:16.104] I0315 02:14:16.104289   58898 controller_utils.go:1027] Waiting for caches to sync for disruption controller
W0315 02:14:16.106] E0315 02:14:16.105850   58898 core.go:77] Failed to start service controller: WARNING: no cloud provider provided, services of type LoadBalancer will fail
W0315 02:14:16.106] W0315 02:14:16.105883   58898 controllermanager.go:489] Skipping "service"
W0315 02:14:16.108] I0315 02:14:16.107676   58898 controllermanager.go:497] Started "persistentvolume-binder"
W0315 02:14:16.109] I0315 02:14:16.108225   58898 controllermanager.go:497] Started "persistentvolume-expander"
W0315 02:14:16.109] I0315 02:14:16.108764   58898 pv_controller_base.go:270] Starting persistent volume controller
W0315 02:14:16.109] I0315 02:14:16.108782   58898 controller_utils.go:1027] Waiting for caches to sync for persistent volume controller
W0315 02:14:16.109] I0315 02:14:16.108807   58898 expand_controller.go:153] Starting expand controller
... skipping 27 lines ...
W0315 02:14:16.852] I0315 02:14:16.517961   58898 controller_utils.go:1027] Waiting for caches to sync for garbage collector controller
W0315 02:14:16.852] I0315 02:14:16.517995   58898 graph_builder.go:308] GraphBuilder running
W0315 02:14:16.853] I0315 02:14:16.518324   58898 controllermanager.go:497] Started "csrcleaner"
W0315 02:14:16.853] W0315 02:14:16.518343   58898 controllermanager.go:476] "bootstrapsigner" is disabled
W0315 02:14:16.853] I0315 02:14:16.518505   58898 cleaner.go:81] Starting CSR cleaner controller
W0315 02:14:16.853] I0315 02:14:16.518740   58898 node_lifecycle_controller.go:77] Sending events to api server
W0315 02:14:16.853] E0315 02:14:16.518834   58898 core.go:161] failed to start cloud node lifecycle controller: no cloud provider provided
W0315 02:14:16.853] W0315 02:14:16.518843   58898 controllermanager.go:489] Skipping "cloud-node-lifecycle"
W0315 02:14:16.853] W0315 02:14:16.519873   58898 probe.go:268] Flexvolume plugin directory at /usr/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
W0315 02:14:16.853] I0315 02:14:16.521127   58898 controllermanager.go:497] Started "attachdetach"
W0315 02:14:16.854] W0315 02:14:16.521152   58898 controllermanager.go:489] Skipping "root-ca-cert-publisher"
W0315 02:14:16.854] I0315 02:14:16.521257   58898 attach_detach_controller.go:323] Starting attach detach controller
W0315 02:14:16.854] I0315 02:14:16.521281   58898 controller_utils.go:1027] Waiting for caches to sync for attach detach controller
... skipping 14 lines ...
W0315 02:14:16.855] I0315 02:14:16.583378   58898 controller_utils.go:1034] Caches are synced for GC controller
W0315 02:14:16.856] I0315 02:14:16.585912   58898 controller_utils.go:1034] Caches are synced for ClusterRoleAggregator controller
W0315 02:14:16.856] I0315 02:14:16.586606   58898 controller_utils.go:1034] Caches are synced for PVC protection controller
W0315 02:14:16.856] I0315 02:14:16.586015   58898 controller_utils.go:1034] Caches are synced for certificate controller
W0315 02:14:16.856] I0315 02:14:16.590473   58898 controller_utils.go:1034] Caches are synced for ReplicationController controller
W0315 02:14:16.856] I0315 02:14:16.605191   58898 controller_utils.go:1034] Caches are synced for namespace controller
W0315 02:14:16.856] E0315 02:14:16.608650   58898 clusterroleaggregation_controller.go:180] admin failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "admin": the object has been modified; please apply your changes to the latest version and try again
W0315 02:14:16.856] E0315 02:14:16.610814   58898 clusterroleaggregation_controller.go:180] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again
W0315 02:14:16.857] I0315 02:14:16.624890   58898 controller_utils.go:1034] Caches are synced for stateful set controller
W0315 02:14:16.857] I0315 02:14:16.790518   58898 controller_utils.go:1034] Caches are synced for deployment controller
W0315 02:14:16.857] I0315 02:14:16.804570   58898 controller_utils.go:1034] Caches are synced for disruption controller
W0315 02:14:16.857] I0315 02:14:16.804618   58898 disruption.go:294] Sending events to api server.
W0315 02:14:16.857] I0315 02:14:16.823509   58898 controller_utils.go:1034] Caches are synced for ReplicaSet controller
W0315 02:14:16.923] I0315 02:14:16.922634   58898 controller_utils.go:1034] Caches are synced for endpoint controller
... skipping 8 lines ...
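
The clusterroleaggregation_controller errors above ("Operation cannot be fulfilled ... the object has been modified; please apply your changes to the latest version and try again") are resourceVersion conflicts from the API server's optimistic concurrency, as is the kubectl apply Conflict later in this log. Callers recover by re-reading the object and retrying; a minimal sketch with client-go's retry helper (the label mutation is invented, and the pre-context Get/Update signatures match client-go of this era):

package sketch

import (
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/util/retry"
)

// relabelAdmin mutates the "admin" clusterrole (named in the log above),
// retrying whenever the server answers 409 Conflict; RetryOnConflict
// retries only errors for which errors.IsConflict is true.
func relabelAdmin(cs kubernetes.Interface) error {
	return retry.RetryOnConflict(retry.DefaultRetry, func() error {
		// Re-read on every attempt so the update carries a fresh resourceVersion.
		cr, err := cs.RbacV1().ClusterRoles().Get("admin", metav1.GetOptions{})
		if err != nil {
			return err
		}
		if cr.Labels == nil {
			cr.Labels = map[string]string{}
		}
		cr.Labels["example.com/touched"] = "true" // hypothetical mutation
		_, err = cs.RbacV1().ClusterRoles().Update(cr)
		return err
	})
}
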
I0315 02:14:17.110] Successful: --output json has correct server info
I0315 02:14:17.110] +++ [0315 02:14:17] Testing kubectl version: verify json output using additional --client flag does not contain serverVersion
I0315 02:14:17.229] Successful: --client --output json has correct client info
I0315 02:14:17.238] Successful: --client --output json has no server info
I0315 02:14:17.242] +++ [0315 02:14:17] Testing kubectl version: compare json output using additional --short flag
W0315 02:14:17.377] I0315 02:14:17.376526   58898 controller_utils.go:1034] Caches are synced for resource quota controller
W0315 02:14:17.378] W0315 02:14:17.378321   58898 actual_state_of_world.go:503] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="127.0.0.1" does not exist
W0315 02:14:17.382] I0315 02:14:17.381526   58898 controller_utils.go:1034] Caches are synced for taint controller
W0315 02:14:17.382] I0315 02:14:17.381676   58898 node_lifecycle_controller.go:1159] Initializing eviction metric for zone: 
W0315 02:14:17.382] I0315 02:14:17.381805   58898 event.go:209] Event(v1.ObjectReference{Kind:"Node", Namespace:"", Name:"127.0.0.1", UID:"0855ccca-46c8-11e9-91a6-0242ac110002", APIVersion:"", ResourceVersion:"", FieldPath:""}): type: 'Normal' reason: 'RegisteredNode' Node 127.0.0.1 event: Registered Node 127.0.0.1 in Controller
W0315 02:14:17.382] I0315 02:14:17.381852   58898 node_lifecycle_controller.go:1009] Controller detected that all Nodes are not-Ready. Entering master disruption mode.
W0315 02:14:17.382] I0315 02:14:17.381681   58898 taint_manager.go:198] Starting NoExecuteTaintManager
W0315 02:14:17.403] I0315 02:14:17.403315   58898 controller_utils.go:1034] Caches are synced for daemon sets controller
... skipping 19 lines ...
I0315 02:14:17.648] +++ [0315 02:14:17] Creating namespace namespace-1552616057-14316
I0315 02:14:17.736] namespace/namespace-1552616057-14316 created
I0315 02:14:17.805] Context "test" modified.
I0315 02:14:17.812] +++ [0315 02:14:17] Testing kubectl(v1:config set)
I0315 02:14:17.880] Cluster "test-cluster" set.
I0315 02:14:17.951] Property "clusters.test-cluster.certificate-authority-data" set.
W0315 02:14:18.052] E0315 02:14:17.775230   58898 resource_quota_controller.go:437] failed to sync resource monitors: couldn't start monitor for resource "extensions/v1beta1, Resource=networkpolicies": unable to monitor quota for resource "extensions/v1beta1, Resource=networkpolicies"
W0315 02:14:18.053] I0315 02:14:18.010792   58898 controller_utils.go:1027] Waiting for caches to sync for garbage collector controller
W0315 02:14:18.112] I0315 02:14:18.112236   58898 controller_utils.go:1034] Caches are synced for garbage collector controller
I0315 02:14:18.213] Property "clusters.test-cluster.certificate-authority-data" set.
I0315 02:14:18.213] +++ exit code: 0
I0315 02:14:18.287] Recording: run_kubectl_local_proxy_tests
I0315 02:14:18.287] Running command: run_kubectl_local_proxy_tests
... skipping 33 lines ...
I0315 02:14:20.316] +++ [0315 02:14:20] Creating namespace namespace-1552616060-2159
I0315 02:14:20.389] namespace/namespace-1552616060-2159 created
I0315 02:14:20.458] Context "test" modified.
I0315 02:14:20.465] +++ [0315 02:14:20] Testing RESTMapper
W0315 02:14:20.566] I0315 02:14:20.436413   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:14:20.566] I0315 02:14:20.440519   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:14:20.667] +++ [0315 02:14:20] "kubectl get unknownresourcetype" returns error as expected: error: the server doesn't have a resource type "unknownresourcetype"
I0315 02:14:20.667] +++ exit code: 0
I0315 02:14:20.708] NAME                              SHORTNAMES   APIGROUP                       NAMESPACED   KIND
I0315 02:14:20.709] bindings                                                                      true         Binding
I0315 02:14:20.709] componentstatuses                 cs                                          false        ComponentStatus
I0315 02:14:20.709] configmaps                        cm                                          true         ConfigMap
I0315 02:14:20.709] endpoints                         ep                                          true         Endpoints
... skipping 677 lines ...
I0315 02:14:39.799] core.sh:194: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0315 02:14:40.044] core.sh:198: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0315 02:14:40.174] core.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0315 02:14:40.292] pod "valid-pod" force deleted
W0315 02:14:40.393] I0315 02:14:39.446753   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:14:40.394] I0315 02:14:39.450127   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:14:40.394] error: resource(s) were provided, but no name, label selector, or --all flag specified
W0315 02:14:40.394] error: setting 'all' parameter but found a non empty selector. 
W0315 02:14:40.394] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0315 02:14:40.448] I0315 02:14:40.447537   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:14:40.451] I0315 02:14:40.450633   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:14:40.552] core.sh:206: Successful get pods -l'name in (valid-pod)' {{range.items}}{{$id_field}}:{{end}}: 
I0315 02:14:40.563] core.sh:211: Successful get namespaces {{range.items}}{{ if eq $id_field \"test-kubectl-describe-pod\" }}found{{end}}{{end}}:: :
I0315 02:14:40.659] namespace/test-kubectl-describe-pod created
... skipping 15 lines ...
I0315 02:14:42.142] core.sh:255: Successful get pdb/test-pdb-4 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 50%
I0315 02:14:42.309] core.sh:261: Successful get pods --namespace=test-kubectl-describe-pod {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 02:14:42.513] pod/env-test-pod created
W0315 02:14:42.614] I0315 02:14:41.448128   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:14:42.614] I0315 02:14:41.451093   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:14:42.614] I0315 02:14:41.510914   55607 controller.go:606] quota admission added evaluator for: poddisruptionbudgets.policy
W0315 02:14:42.615] error: min-available and max-unavailable cannot be both specified
W0315 02:14:42.615] I0315 02:14:42.448835   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:14:42.615] I0315 02:14:42.451581   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:14:42.716] core.sh:264: Successful describe pods --namespace=test-kubectl-describe-pod env-test-pod:
I0315 02:14:42.716] Name:               env-test-pod
I0315 02:14:42.716] Namespace:          test-kubectl-describe-pod
I0315 02:14:42.716] Priority:           0
... skipping 173 lines ...
W0315 02:14:55.473] I0315 02:14:55.458011   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:14:55.574] replicationcontroller "modified" deleted
I0315 02:14:55.791] core.sh:434: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 02:14:55.994] pod/valid-pod created
I0315 02:14:56.113] core.sh:438: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0315 02:14:56.296] Successful
I0315 02:14:56.297] message:Error from server: cannot restore map from string
I0315 02:14:56.297] has:cannot restore map from string
W0315 02:14:56.397] E0315 02:14:56.284992   55607 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"cannot restore map from string"}
W0315 02:14:56.459] I0315 02:14:56.458598   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:14:56.459] I0315 02:14:56.458860   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:14:56.560] Successful
I0315 02:14:56.560] message:pod/valid-pod patched (no change)
I0315 02:14:56.560] has:patched (no change)
I0315 02:14:56.561] pod/valid-pod patched
... skipping 6 lines ...
I0315 02:14:57.257] pod/valid-pod patched
I0315 02:14:57.370] core.sh:470: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: changed-with-yaml:
I0315 02:14:57.457] pod/valid-pod patched
I0315 02:14:57.555] core.sh:475: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:3.1:
I0315 02:14:57.721] pod/valid-pod patched
I0315 02:14:57.821] core.sh:491: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
I0315 02:14:58.005] +++ [0315 02:14:58] "kubectl patch with resourceVersion 500" returns error as expected: Error from server (Conflict): Operation cannot be fulfilled on pods "valid-pod": the object has been modified; please apply your changes to the latest version and try again
W0315 02:14:58.106] I0315 02:14:57.459216   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:14:58.107] I0315 02:14:57.459433   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:14:58.272] pod "valid-pod" deleted
I0315 02:14:58.286] pod/valid-pod replaced
I0315 02:14:58.388] core.sh:515: Successful get pod valid-pod {{(index .spec.containers 0).name}}: replaced-k8s-serve-hostname
I0315 02:14:58.567] Successful
I0315 02:14:58.567] message:error: --grace-period must have --force specified
I0315 02:14:58.567] has:\-\-grace-period must have \-\-force specified
W0315 02:14:58.668] I0315 02:14:58.459684   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:14:58.668] I0315 02:14:58.459889   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:14:58.769] Successful
I0315 02:14:58.769] message:error: --timeout must have --force specified
I0315 02:14:58.769] has:\-\-timeout must have \-\-force specified
I0315 02:14:58.926] node/node-v1-test created
W0315 02:14:59.026] W0315 02:14:58.927303   58898 actual_state_of_world.go:503] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="node-v1-test" does not exist
I0315 02:14:59.127] node/node-v1-test replaced
I0315 02:14:59.212] core.sh:552: Successful get node node-v1-test {{.metadata.annotations.a}}: b
I0315 02:14:59.300] node "node-v1-test" deleted
I0315 02:14:59.426] core.sh:559: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
I0315 02:14:59.733] core.sh:562: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/serve_hostname:
I0315 02:15:00.817] core.sh:575: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
... skipping 20 lines ...
W0315 02:15:01.583] Edit cancelled, no changes made.
W0315 02:15:01.584] Edit cancelled, no changes made.
W0315 02:15:01.584] I0315 02:15:00.461273   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:15:01.584] I0315 02:15:00.461598   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:15:01.584] Edit cancelled, no changes made.
W0315 02:15:01.584] Edit cancelled, no changes made.
W0315 02:15:01.585] error: 'name' already has a value (valid-pod), and --overwrite is false
W0315 02:15:01.585] I0315 02:15:01.463503   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:15:01.585] I0315 02:15:01.463688   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:15:01.685] core.sh:597: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod-super-sayan
I0315 02:15:01.714] core.sh:601: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0315 02:15:01.824] pod "valid-pod" force deleted
W0315 02:15:01.924] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
... skipping 96 lines ...
I0315 02:15:08.780] +++ Running case: test-cmd.run_kubectl_create_error_tests 
I0315 02:15:08.782] +++ working dir: /go/src/k8s.io/kubernetes
I0315 02:15:08.785] +++ command: run_kubectl_create_error_tests
I0315 02:15:08.798] +++ [0315 02:15:08] Creating namespace namespace-1552616108-11475
I0315 02:15:08.868] namespace/namespace-1552616108-11475 created
I0315 02:15:08.939] Context "test" modified.
I0315 02:15:08.946] +++ [0315 02:15:08] Testing kubectl create with error
W0315 02:15:09.046] I0315 02:15:08.467745   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:15:09.047] I0315 02:15:08.467958   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:15:09.047] Error: must specify one of -f and -k
W0315 02:15:09.047] 
W0315 02:15:09.047] Create a resource from a file or from stdin.
W0315 02:15:09.047] 
W0315 02:15:09.047]  JSON and YAML formats are accepted.
W0315 02:15:09.048] 
W0315 02:15:09.048] Examples:
... skipping 41 lines ...
W0315 02:15:09.053] 
W0315 02:15:09.053] Usage:
W0315 02:15:09.053]   kubectl create -f FILENAME [options]
W0315 02:15:09.054] 
W0315 02:15:09.054] Use "kubectl <command> --help" for more information about a given command.
W0315 02:15:09.054] Use "kubectl options" for a list of global command-line options (applies to all commands).
I0315 02:15:09.182] +++ [0315 02:15:09] "kubectl create with empty string list returns error as expected: error: error validating "hack/testdata/invalid-rc-with-empty-args.yaml": error validating data: ValidationError(ReplicationController.spec.template.spec.containers[0].args): unknown object type "nil" in ReplicationController.spec.template.spec.containers[0].args[0]; if you choose to ignore these errors, turn validation off with --validate=false
W0315 02:15:09.283] kubectl convert is DEPRECATED and will be removed in a future version.
W0315 02:15:09.283] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
I0315 02:15:09.384] +++ exit code: 0
I0315 02:15:09.401] Recording: run_kubectl_apply_tests
I0315 02:15:09.402] Running command: run_kubectl_apply_tests
I0315 02:15:09.422] 
... skipping 35 lines ...
I0315 02:15:11.781] Running command: run_kubectl_run_tests
I0315 02:15:11.803] 
I0315 02:15:11.806] +++ Running case: test-cmd.run_kubectl_run_tests 
I0315 02:15:11.809] +++ working dir: /go/src/k8s.io/kubernetes
I0315 02:15:11.812] +++ command: run_kubectl_run_tests
I0315 02:15:11.823] +++ [0315 02:15:11] Creating namespace namespace-1552616111-18419
W0315 02:15:11.924] Error from server (NotFound): resources.mygroup.example.com "myobj" not found
I0315 02:15:12.025] namespace/namespace-1552616111-18419 created
I0315 02:15:12.027] Context "test" modified.
I0315 02:15:12.036] +++ [0315 02:15:12] Testing kubectl run
I0315 02:15:12.162] run.sh:29: Successful get jobs {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 02:15:12.309] job.batch/pi created
W0315 02:15:12.410] kubectl run --generator=job/v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
... skipping 95 lines ...
I0315 02:15:15.576] Context "test" modified.
I0315 02:15:15.577] +++ [0315 02:15:15] Testing kubectl create filter
I0315 02:15:15.632] create.sh:30: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 02:15:15.869] pod/selector-test-pod created
I0315 02:15:16.008] create.sh:34: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
I0315 02:15:16.131] Successful
I0315 02:15:16.132] message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
I0315 02:15:16.133] has:pods "selector-test-pod-dont-apply" not found
I0315 02:15:16.260] pod "selector-test-pod" deleted
I0315 02:15:16.283] +++ exit code: 0
I0315 02:15:16.345] Recording: run_kubectl_apply_deployments_tests
I0315 02:15:16.346] Running command: run_kubectl_apply_deployments_tests
I0315 02:15:16.368] 
... skipping 47 lines ...
W0315 02:15:20.413] I0315 02:15:20.330439   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616116-6213", Name:"nginx-776cc67f78", UID:"2ebe20d7-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"600", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-776cc67f78-6blf7
W0315 02:15:20.413] I0315 02:15:20.333493   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616116-6213", Name:"nginx-776cc67f78", UID:"2ebe20d7-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"600", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-776cc67f78-cv2gm
W0315 02:15:20.479] I0315 02:15:20.478363   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:15:20.479] I0315 02:15:20.479553   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:15:20.580] apps.sh:147: Successful get deployment nginx {{.metadata.name}}: nginx
I0315 02:15:24.877] Successful
I0315 02:15:24.877] message:Error from server (Conflict): error when applying patch:
I0315 02:15:24.878] {"metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"extensions/v1beta1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1552616116-6213\",\"resourceVersion\":\"99\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx2\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx2\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"},"resourceVersion":"99"},"spec":{"selector":{"matchLabels":{"name":"nginx2"}},"template":{"metadata":{"labels":{"name":"nginx2"}}}}}
I0315 02:15:24.878] to:
I0315 02:15:24.878] Resource: "extensions/v1beta1, Resource=deployments", GroupVersionKind: "extensions/v1beta1, Kind=Deployment"
I0315 02:15:24.878] Name: "nginx", Namespace: "namespace-1552616116-6213"
I0315 02:15:24.880] Object: &{map["metadata":map["namespace":"namespace-1552616116-6213" "selfLink":"/apis/extensions/v1beta1/namespaces/namespace-1552616116-6213/deployments/nginx" "generation":'\x01' "labels":map["name":"nginx"] "managedFields":[map["fields":map["f:metadata":map["f:annotations":map["f:deployment.kubernetes.io/revision":map[]]] "f:status":map["f:replicas":map[] "f:unavailableReplicas":map[] "f:updatedReplicas":map[] "f:conditions":map[".":map[] "k:{\"type\":\"Available\"}":map["f:lastTransitionTime":map[] "f:lastUpdateTime":map[] "f:message":map[] "f:reason":map[] "f:status":map[] "f:type":map[] ".":map[]]] "f:observedGeneration":map[]]] "manager":"kube-controller-manager" "operation":"Update" "apiVersion":"apps/v1" "time":"2019-03-15T02:15:20Z"] map["manager":"kubectl" "operation":"Update" "apiVersion":"extensions/v1beta1" "time":"2019-03-15T02:15:20Z" "fields":map["f:metadata":map["f:annotations":map[".":map[] "f:kubectl.kubernetes.io/last-applied-configuration":map[]] "f:labels":map[".":map[] "f:name":map[]]] "f:spec":map["f:replicas":map[] "f:revisionHistoryLimit":map[] "f:selector":map[".":map[] "f:matchLabels":map[".":map[] "f:name":map[]]] "f:strategy":map["f:rollingUpdate":map[".":map[] "f:maxSurge":map[] "f:maxUnavailable":map[]] "f:type":map[]] "f:template":map["f:metadata":map["f:labels":map[".":map[] "f:name":map[]]] "f:spec":map["f:securityContext":map[] "f:terminationGracePeriodSeconds":map[] "f:containers":map["k:{\"name\":\"nginx\"}":map["f:imagePullPolicy":map[] "f:name":map[] "f:ports":map[".":map[] "k:{\"containerPort\":80,\"protocol\":\"TCP\"}":map[".":map[] "f:containerPort":map[] "f:protocol":map[]]] "f:resources":map[] "f:terminationMessagePath":map[] "f:terminationMessagePolicy":map[] ".":map[] "f:image":map[]]] "f:dnsPolicy":map[] "f:restartPolicy":map[] "f:schedulerName":map[]]] "f:progressDeadlineSeconds":map[]]]]] "name":"nginx" "uid":"2ebcdd47-46c8-11e9-91a6-0242ac110002" "resourceVersion":"612" "creationTimestamp":"2019-03-15T02:15:20Z" "annotations":map["kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"extensions/v1beta1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1552616116-6213\"},\"spec\":{\"replicas\":3,\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx1\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n" "deployment.kubernetes.io/revision":"1"]] "spec":map["template":map["metadata":map["creationTimestamp":<nil> "labels":map["name":"nginx1"]] "spec":map["containers":[map["image":"k8s.gcr.io/nginx:test-cmd" "ports":[map["protocol":"TCP" "containerPort":'P']] "resources":map[] "terminationMessagePath":"/dev/termination-log" "terminationMessagePolicy":"File" "imagePullPolicy":"IfNotPresent" "name":"nginx"]] "restartPolicy":"Always" "terminationGracePeriodSeconds":'\x1e' "dnsPolicy":"ClusterFirst" "securityContext":map[] "schedulerName":"default-scheduler"]] "strategy":map["type":"RollingUpdate" "rollingUpdate":map["maxUnavailable":'\x01' "maxSurge":'\x01']] "revisionHistoryLimit":%!q(int64=+2147483647) "progressDeadlineSeconds":%!q(int64=+2147483647) "replicas":'\x03' "selector":map["matchLabels":map["name":"nginx1"]]] "status":map["unavailableReplicas":'\x03' "conditions":[map["type":"Available" "status":"False" "lastUpdateTime":"2019-03-15T02:15:20Z" "lastTransitionTime":"2019-03-15T02:15:20Z" "reason":"MinimumReplicasUnavailable" 
"message":"Deployment does not have minimum availability."]] "observedGeneration":'\x01' "replicas":'\x03' "updatedReplicas":'\x03'] "kind":"Deployment" "apiVersion":"extensions/v1beta1"]}
I0315 02:15:24.881] for: "hack/testdata/deployment-label-change2.yaml": Operation cannot be fulfilled on deployments.extensions "nginx": the object has been modified; please apply your changes to the latest version and try again
I0315 02:15:24.881] has:Error from server (Conflict)
W0315 02:15:24.981] I0315 02:15:21.479862   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:15:24.982] I0315 02:15:21.480501   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:15:24.982] I0315 02:15:22.480433   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:15:24.982] I0315 02:15:22.480662   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:15:24.982] I0315 02:15:23.176443   58898 horizontal.go:320] Horizontal Pod Autoscaler frontend has been deleted in namespace-1552616106-23420
W0315 02:15:24.983] I0315 02:15:23.480999   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
... skipping 5 lines ...
W0315 02:15:26.483] I0315 02:15:26.482743   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:15:26.483] I0315 02:15:26.482963   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:15:27.483] I0315 02:15:27.483263   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:15:27.484] I0315 02:15:27.483610   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:15:28.484] I0315 02:15:28.483867   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:15:28.484] I0315 02:15:28.484140   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:15:29.176] E0315 02:15:29.176165   58898 replica_set.go:450] Sync "namespace-1552616116-6213/nginx-776cc67f78" failed with replicasets.apps "nginx-776cc67f78" not found
W0315 02:15:29.182] E0315 02:15:29.181796   58898 replica_set.go:450] Sync "namespace-1552616116-6213/nginx-776cc67f78" failed with replicasets.apps "nginx-776cc67f78" not found
W0315 02:15:29.485] I0315 02:15:29.484434   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:15:29.485] I0315 02:15:29.484638   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:15:30.148] deployment.extensions/nginx configured
W0315 02:15:30.248] I0315 02:15:30.153450   58898 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552616116-6213", Name:"nginx", UID:"349a4dad-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"633", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-7bd4fbc645 to 3
W0315 02:15:30.249] I0315 02:15:30.157726   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616116-6213", Name:"nginx-7bd4fbc645", UID:"349b2768-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"634", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-7bd4fbc645-bhpgd
W0315 02:15:30.249] I0315 02:15:30.162142   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616116-6213", Name:"nginx-7bd4fbc645", UID:"349b2768-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"634", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-7bd4fbc645-q25q7
... skipping 186 lines ...
I0315 02:15:37.745] +++ [0315 02:15:37] Creating namespace namespace-1552616137-13274
I0315 02:15:37.822] namespace/namespace-1552616137-13274 created
I0315 02:15:37.892] Context "test" modified.
I0315 02:15:37.900] +++ [0315 02:15:37] Testing kubectl get
I0315 02:15:37.994] get.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 02:15:38.087] Successful
I0315 02:15:38.087] message:Error from server (NotFound): pods "abc" not found
I0315 02:15:38.087] has:pods "abc" not found
I0315 02:15:38.188] get.sh:37: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 02:15:38.282] Successful
I0315 02:15:38.282] message:Error from server (NotFound): pods "abc" not found
I0315 02:15:38.282] has:pods "abc" not found
I0315 02:15:38.380] get.sh:45: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 02:15:38.472] Successful
I0315 02:15:38.473] message:{
I0315 02:15:38.473]     "apiVersion": "v1",
I0315 02:15:38.473]     "items": [],
... skipping 23 lines ...
I0315 02:15:38.842] has not:No resources found
I0315 02:15:38.935] Successful
I0315 02:15:38.935] message:NAME
I0315 02:15:38.935] has not:No resources found
I0315 02:15:39.042] get.sh:73: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 02:15:39.157] Successful
I0315 02:15:39.157] message:error: the server doesn't have a resource type "foobar"
I0315 02:15:39.157] has not:No resources found
I0315 02:15:39.247] Successful
I0315 02:15:39.248] message:No resources found.
I0315 02:15:39.248] has:No resources found
I0315 02:15:39.333] Successful
I0315 02:15:39.334] message:
I0315 02:15:39.334] has not:No resources found
I0315 02:15:39.425] Successful
I0315 02:15:39.426] message:No resources found.
I0315 02:15:39.426] has:No resources found
I0315 02:15:39.518] get.sh:93: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 02:15:39.613] Successful
I0315 02:15:39.613] message:Error from server (NotFound): pods "abc" not found
I0315 02:15:39.614] has:pods "abc" not found
I0315 02:15:39.615] FAIL!
I0315 02:15:39.616] message:Error from server (NotFound): pods "abc" not found
I0315 02:15:39.616] has not:List
I0315 02:15:39.616] 99 /go/src/k8s.io/kubernetes/test/cmd/../../test/cmd/get.sh
W0315 02:15:39.716] I0315 02:15:38.489758   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:15:39.717] I0315 02:15:38.489951   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:15:39.717] I0315 02:15:39.490268   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:15:39.717] I0315 02:15:39.490606   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
... skipping 717 lines ...
I0315 02:15:43.238] }
I0315 02:15:43.338] get.sh:155: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0315 02:15:43.576] <no value>Successful
I0315 02:15:43.576] message:valid-pod:
I0315 02:15:43.576] has:valid-pod:
I0315 02:15:43.663] Successful
I0315 02:15:43.663] message:error: error executing jsonpath "{.missing}": Error executing template: missing is not found. Printing more information for debugging the template:
I0315 02:15:43.664] 	template was:
I0315 02:15:43.664] 		{.missing}
I0315 02:15:43.664] 	object given to jsonpath engine was:
I0315 02:15:43.665] 		map[string]interface {}{"spec":map[string]interface {}{"securityContext":map[string]interface {}{}, "schedulerName":"default-scheduler", "priority":0, "enableServiceLinks":true, "containers":[]interface {}{map[string]interface {}{"name":"kubernetes-serve-hostname", "image":"k8s.gcr.io/serve_hostname", "resources":map[string]interface {}{"requests":map[string]interface {}{"memory":"512Mi", "cpu":"1"}, "limits":map[string]interface {}{"memory":"512Mi", "cpu":"1"}}, "terminationMessagePath":"/dev/termination-log", "terminationMessagePolicy":"File", "imagePullPolicy":"Always"}}, "restartPolicy":"Always", "terminationGracePeriodSeconds":30, "dnsPolicy":"ClusterFirst"}, "status":map[string]interface {}{"phase":"Pending", "qosClass":"Guaranteed"}, "kind":"Pod", "apiVersion":"v1", "metadata":map[string]interface {}{"creationTimestamp":"2019-03-15T02:15:43Z", "labels":map[string]interface {}{"name":"valid-pod"}, "managedFields":[]interface {}{map[string]interface {}{"manager":"kubectl", "operation":"Update", "apiVersion":"v1", "time":"2019-03-15T02:15:43Z", "fields":map[string]interface {}{"f:metadata":map[string]interface {}{"f:labels":map[string]interface {}{".":map[string]interface {}{}, "f:name":map[string]interface {}{}}}, "f:spec":map[string]interface {}{"f:restartPolicy":map[string]interface {}{}, "f:schedulerName":map[string]interface {}{}, "f:securityContext":map[string]interface {}{}, "f:terminationGracePeriodSeconds":map[string]interface {}{}, "f:containers":map[string]interface {}{"k:{\"name\":\"kubernetes-serve-hostname\"}":map[string]interface {}{"f:image":map[string]interface {}{}, "f:imagePullPolicy":map[string]interface {}{}, "f:name":map[string]interface {}{}, "f:resources":map[string]interface {}{".":map[string]interface {}{}, "f:limits":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}, "f:requests":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}}, "f:terminationMessagePath":map[string]interface {}{}, "f:terminationMessagePolicy":map[string]interface {}{}, ".":map[string]interface {}{}}}, "f:dnsPolicy":map[string]interface {}{}, "f:enableServiceLinks":map[string]interface {}{}, "f:priority":map[string]interface {}{}}}}}, "name":"valid-pod", "namespace":"namespace-1552616142-17750", "selfLink":"/api/v1/namespaces/namespace-1552616142-17750/pods/valid-pod", "uid":"3c5860fb-46c8-11e9-91a6-0242ac110002", "resourceVersion":"704"}}
I0315 02:15:43.666] has:missing is not found
I0315 02:15:43.744] Successful
I0315 02:15:43.744] message:Error executing template: template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing". Printing more information for debugging the template:
I0315 02:15:43.744] 	template was:
I0315 02:15:43.744] 		{{.missing}}
I0315 02:15:43.745] 	raw data was:
I0315 02:15:43.746] 		{"apiVersion":"v1","kind":"Pod","metadata":{"creationTimestamp":"2019-03-15T02:15:43Z","labels":{"name":"valid-pod"},"managedFields":[{"apiVersion":"v1","fields":{"f:metadata":{"f:labels":{".":{},"f:name":{}}},"f:spec":{"f:containers":{"k:{\"name\":\"kubernetes-serve-hostname\"}":{".":{},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:resources":{".":{},"f:limits":{".":{},"f:cpu":{},"f:memory":{}},"f:requests":{".":{},"f:cpu":{},"f:memory":{}}},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:enableServiceLinks":{},"f:priority":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}},"manager":"kubectl","operation":"Update","time":"2019-03-15T02:15:43Z"}],"name":"valid-pod","namespace":"namespace-1552616142-17750","resourceVersion":"704","selfLink":"/api/v1/namespaces/namespace-1552616142-17750/pods/valid-pod","uid":"3c5860fb-46c8-11e9-91a6-0242ac110002"},"spec":{"containers":[{"image":"k8s.gcr.io/serve_hostname","imagePullPolicy":"Always","name":"kubernetes-serve-hostname","resources":{"limits":{"cpu":"1","memory":"512Mi"},"requests":{"cpu":"1","memory":"512Mi"}},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File"}],"dnsPolicy":"ClusterFirst","enableServiceLinks":true,"priority":0,"restartPolicy":"Always","schedulerName":"default-scheduler","securityContext":{},"terminationGracePeriodSeconds":30},"status":{"phase":"Pending","qosClass":"Guaranteed"}}
I0315 02:15:43.746] 	object given to template engine was:
I0315 02:15:43.747] 		map[kind:Pod metadata:map[resourceVersion:704 selfLink:/api/v1/namespaces/namespace-1552616142-17750/pods/valid-pod uid:3c5860fb-46c8-11e9-91a6-0242ac110002 creationTimestamp:2019-03-15T02:15:43Z labels:map[name:valid-pod] managedFields:[map[apiVersion:v1 fields:map[f:spec:map[f:priority:map[] f:restartPolicy:map[] f:schedulerName:map[] f:securityContext:map[] f:terminationGracePeriodSeconds:map[] f:containers:map[k:{"name":"kubernetes-serve-hostname"}:map[f:terminationMessagePolicy:map[] .:map[] f:image:map[] f:imagePullPolicy:map[] f:name:map[] f:resources:map[.:map[] f:limits:map[.:map[] f:cpu:map[] f:memory:map[]] f:requests:map[f:memory:map[] .:map[] f:cpu:map[]]] f:terminationMessagePath:map[]]] f:dnsPolicy:map[] f:enableServiceLinks:map[]] f:metadata:map[f:labels:map[.:map[] f:name:map[]]]] manager:kubectl operation:Update time:2019-03-15T02:15:43Z]] name:valid-pod namespace:namespace-1552616142-17750] spec:map[containers:[map[imagePullPolicy:Always name:kubernetes-serve-hostname resources:map[limits:map[cpu:1 memory:512Mi] requests:map[cpu:1 memory:512Mi]] terminationMessagePath:/dev/termination-log terminationMessagePolicy:File image:k8s.gcr.io/serve_hostname]] dnsPolicy:ClusterFirst enableServiceLinks:true priority:0 restartPolicy:Always schedulerName:default-scheduler securityContext:map[] terminationGracePeriodSeconds:30] status:map[phase:Pending qosClass:Guaranteed] apiVersion:v1]
I0315 02:15:43.747] has:map has no entry for key "missing"
W0315 02:15:43.848] I0315 02:15:43.492636   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:15:43.848] I0315 02:15:43.492832   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:15:43.848] error: error executing template "{{.missing}}": template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing"
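[The template failure above is standard Go text/template behavior when the template is executed with missingkey=error, which the exact wording "map has no entry for key" suggests kubectl's go-template printer uses. A minimal standalone reproduction, illustrative only and not kubectl's code:

    package main

    import (
    	"fmt"
    	"os"
    	"text/template"
    )

    func main() {
    	// With missingkey=error, referencing a key absent from the data map
    	// is a hard error instead of the default "<no value>" output.
    	tmpl := template.Must(template.New("output").
    		Option("missingkey=error").
    		Parse("{{.missing}}"))
    	obj := map[string]interface{}{"kind": "Pod"} // no "missing" key
    	if err := tmpl.Execute(os.Stdout, obj); err != nil {
    		// template: output:1:2: executing "output" at <.missing>:
    		// map has no entry for key "missing"
    		fmt.Fprintln(os.Stderr, err)
    	}
    }
]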
W0315 02:15:44.493] I0315 02:15:44.493117   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:15:44.494] I0315 02:15:44.493354   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:15:44.818] E0315 02:15:44.817532   70332 streamwatcher.go:109] Unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)
I0315 02:15:44.918] Successful
I0315 02:15:44.919] message:NAME        READY   STATUS    RESTARTS   AGE
I0315 02:15:44.919] valid-pod   0/1     Pending   0          0s
... skipping 158 lines ...
I0315 02:15:47.111]   terminationGracePeriodSeconds: 30
I0315 02:15:47.111] status:
I0315 02:15:47.112]   phase: Pending
I0315 02:15:47.112]   qosClass: Guaranteed
I0315 02:15:47.112] has:name: valid-pod
I0315 02:15:47.112] Successful
I0315 02:15:47.112] message:Error from server (NotFound): pods "invalid-pod" not found
I0315 02:15:47.112] has:"invalid-pod" not found
I0315 02:15:47.178] pod "valid-pod" deleted
I0315 02:15:47.269] get.sh:193: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 02:15:47.451] pod/redis-master created
I0315 02:15:47.457] pod/valid-pod created
I0315 02:15:47.550] Successful
... skipping 293 lines ...
I0315 02:15:54.408] Running command: run_create_secret_tests
I0315 02:15:54.432] 
I0315 02:15:54.435] +++ Running case: test-cmd.run_create_secret_tests 
I0315 02:15:54.438] +++ working dir: /go/src/k8s.io/kubernetes
I0315 02:15:54.441] +++ command: run_create_secret_tests
I0315 02:15:54.535] Successful
I0315 02:15:54.536] message:Error from server (NotFound): secrets "mysecret" not found
I0315 02:15:54.536] has:secrets "mysecret" not found
W0315 02:15:54.636] I0315 02:15:54.498507   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:15:54.676] I0315 02:15:54.498780   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:15:54.776] Successful
I0315 02:15:54.777] message:Error from server (NotFound): secrets "mysecret" not found
I0315 02:15:54.777] has:secrets "mysecret" not found
I0315 02:15:55.223] Successful
I0315 02:15:55.224] message:user-specified
I0315 02:15:55.224] has:user-specified
I0315 02:15:55.224] Successful
I0315 02:15:55.224] {"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-create-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-create-cm","uid":"43556b3a-46c8-11e9-91a6-0242ac110002","resourceVersion":"812","creationTimestamp":"2019-03-15T02:15:54Z"}}
... skipping 182 lines ...
I0315 02:16:00.928] has:Timeout exceeded while reading body
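[The "Client.Timeout exceeded while reading body" outcome above is what a request timeout does to a streaming response: Go's http.Client.Timeout bounds the whole exchange, body included, so a watch whose headers arrive promptly is still cut off mid-body once the deadline passes. A hedged sketch against a hypothetical slow endpoint (not the kubectl test itself):

    package main

    import (
    	"fmt"
    	"io"
    	"net/http"
    	"net/http/httptest"
    	"time"
    )

    func main() {
    	// A test server whose body stalls past the client deadline.
    	srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
    		w.WriteHeader(http.StatusOK)
    		if f, ok := w.(http.Flusher); ok {
    			f.Flush() // headers go out immediately, like a watch stream
    		}
    		time.Sleep(3 * time.Second) // body arrives too late
    		fmt.Fprintln(w, "event")
    	}))
    	defer srv.Close()

    	client := &http.Client{Timeout: 1 * time.Second} // covers the body read too
    	resp, err := client.Get(srv.URL)
    	if err != nil {
    		fmt.Println(err)
    		return
    	}
    	defer resp.Body.Close()
    	_, err = io.ReadAll(resp.Body)
    	fmt.Println(err) // error names Client.Timeout ... while reading body
    }
]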
I0315 02:16:01.014] Successful
I0315 02:16:01.014] message:NAME        READY   STATUS    RESTARTS   AGE
I0315 02:16:01.015] valid-pod   0/1     Pending   0          2s
I0315 02:16:01.015] has:valid-pod
I0315 02:16:01.091] Successful
I0315 02:16:01.092] message:error: Invalid timeout value. Timeout must be a single integer in seconds, or an integer followed by a corresponding time unit (e.g. 1s | 2m | 3h)
I0315 02:16:01.092] has:Invalid timeout value
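[The rejected flag above accepts either a bare integer (seconds) or a Go-style duration. A hypothetical validator with that shape, illustrative only and not kubectl's actual implementation (time.ParseDuration also accepts forms like "1h30m", broader than the message advertises):

    package main

    import (
    	"errors"
    	"fmt"
    	"strconv"
    	"time"
    )

    // parseTimeout mirrors the accepted forms in the error message above:
    // "30" (plain seconds) or "1s | 2m | 3h" style durations.
    func parseTimeout(s string) (time.Duration, error) {
    	if n, err := strconv.Atoi(s); err == nil {
    		return time.Duration(n) * time.Second, nil
    	}
    	if d, err := time.ParseDuration(s); err == nil {
    		return d, nil
    	}
    	return 0, errors.New("Invalid timeout value. Timeout must be a single integer in seconds, or an integer followed by a corresponding time unit (e.g. 1s | 2m | 3h)")
    }

    func main() {
    	for _, v := range []string{"30", "2m", "banana"} {
    		d, err := parseTimeout(v)
    		fmt.Println(v, d, err)
    	}
    }
]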
I0315 02:16:01.199] pod "valid-pod" deleted
I0315 02:16:01.223] +++ exit code: 0
W0315 02:16:01.503] I0315 02:16:01.502870   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:16:01.504] I0315 02:16:01.503098   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:16:02.225] Recording: run_crd_tests
... skipping 245 lines ...
I0315 02:16:07.013] foo.company.com/test patched
I0315 02:16:07.120] crd.sh:237: Successful get foos/test {{.patched}}: value1
I0315 02:16:07.214] foo.company.com/test patched
I0315 02:16:07.316] crd.sh:239: Successful get foos/test {{.patched}}: value2
I0315 02:16:07.404] foo.company.com/test patched
I0315 02:16:07.506] crd.sh:241: Successful get foos/test {{.patched}}: <no value>
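[The <no value> result above comes from patching the custom resource with {"patched":null} via --type merge (the recorded change-cause appears below); custom resources have no strategic-merge schema, which the error just below exercises, so only JSON merge patch applies. A minimal RFC 7386 sketch of why a null value clears the field, illustrative and not the apimachinery implementation:

    package main

    import (
    	"encoding/json"
    	"fmt"
    )

    // mergePatch applies an RFC 7386 JSON Merge Patch: null deletes a key,
    // objects merge recursively, everything else replaces.
    func mergePatch(target, patch map[string]interface{}) map[string]interface{} {
    	for k, v := range patch {
    		if v == nil {
    			delete(target, k)
    			continue
    		}
    		pm, pok := v.(map[string]interface{})
    		tm, tok := target[k].(map[string]interface{})
    		if pok && tok {
    			target[k] = mergePatch(tm, pm)
    			continue
    		}
    		target[k] = v
    	}
    	return target
    }

    func main() {
    	obj := map[string]interface{}{"patched": "value2", "kind": "Foo"}
    	var patch map[string]interface{}
    	if err := json.Unmarshal([]byte(`{"patched":null}`), &patch); err != nil {
    		panic(err) // JSON null survives as a nil map value
    	}
    	fmt.Println(mergePatch(obj, patch)) // map[kind:Foo]
    }
]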
I0315 02:16:07.672] +++ [0315 02:16:07] "kubectl patch --local" returns error as expected for CustomResource: error: cannot apply strategic merge patch for company.com/v1, Kind=Foo locally, try --type merge
I0315 02:16:07.744] {
I0315 02:16:07.744]     "apiVersion": "company.com/v1",
I0315 02:16:07.744]     "kind": "Foo",
I0315 02:16:07.744]     "metadata": {
I0315 02:16:07.744]         "annotations": {
I0315 02:16:07.745]             "kubernetes.io/change-cause": "kubectl patch foos/test --server=http://127.0.0.1:8080 --match-server-version=true --patch={\"patched\":null} --type=merge --record=true"
... skipping 279 lines ...
W0315 02:16:16.519] I0315 02:16:16.518358   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:16:16.520] I0315 02:16:16.518776   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:16:17.520] I0315 02:16:17.519353   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:16:17.521] I0315 02:16:17.519740   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:16:18.521] I0315 02:16:18.520107   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:16:18.521] I0315 02:16:18.520358   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:16:18.695] E0315 02:16:18.693842   58898 resource_quota_controller.go:437] failed to sync resource monitors: [couldn't start monitor for resource "extensions/v1beta1, Resource=networkpolicies": unable to monitor quota for resource "extensions/v1beta1, Resource=networkpolicies", couldn't start monitor for resource "mygroup.example.com/v1alpha1, Resource=resources": unable to monitor quota for resource "mygroup.example.com/v1alpha1, Resource=resources", couldn't start monitor for resource "company.com/v1, Resource=validfoos": unable to monitor quota for resource "company.com/v1, Resource=validfoos", couldn't start monitor for resource "company.com/v1, Resource=bars": unable to monitor quota for resource "company.com/v1, Resource=bars", couldn't start monitor for resource "company.com/v1, Resource=foos": unable to monitor quota for resource "company.com/v1, Resource=foos"]
W0315 02:16:19.327] I0315 02:16:19.326317   58898 controller_utils.go:1027] Waiting for caches to sync for garbage collector controller
W0315 02:16:19.328] I0315 02:16:19.328084   55607 clientconn.go:551] parsed scheme: ""
W0315 02:16:19.329] I0315 02:16:19.328320   55607 clientconn.go:557] scheme "" not registered, fallback to default scheme
W0315 02:16:19.329] I0315 02:16:19.328452   55607 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W0315 02:16:19.329] I0315 02:16:19.328504   55607 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 02:16:19.329] I0315 02:16:19.328927   55607 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
... skipping 88 lines ...
W0315 02:16:30.528] I0315 02:16:30.527818   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:16:30.529] I0315 02:16:30.528134   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:16:31.529] I0315 02:16:31.528536   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:16:31.529] I0315 02:16:31.528777   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:16:32.135] crd.sh:459: Successful get bars {{len .items}}: 0
I0315 02:16:32.331] customresourcedefinition.apiextensions.k8s.io "foos.company.com" deleted
W0315 02:16:32.432] Error from server (NotFound): namespaces "non-native-resources" not found
W0315 02:16:32.529] I0315 02:16:32.529059   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:16:32.530] I0315 02:16:32.529337   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:16:32.630] customresourcedefinition.apiextensions.k8s.io "bars.company.com" deleted
I0315 02:16:32.631] customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
I0315 02:16:32.656] customresourcedefinition.apiextensions.k8s.io "validfoos.company.com" deleted
I0315 02:16:32.698] +++ exit code: 0
... skipping 9 lines ...
I0315 02:16:32.958] +++ [0315 02:16:32] Testing cmd with image
I0315 02:16:33.056] Successful
I0315 02:16:33.057] message:deployment.apps/test1 created
I0315 02:16:33.057] has:deployment.apps/test1 created
I0315 02:16:33.144] deployment.extensions "test1" deleted
I0315 02:16:33.230] Successful
I0315 02:16:33.231] message:error: Invalid image name "InvalidImageName": invalid reference format
I0315 02:16:33.231] has:error: Invalid image name "InvalidImageName": invalid reference format
I0315 02:16:33.248] +++ exit code: 0
I0315 02:16:33.306] +++ [0315 02:16:33] Testing recursive resources
I0315 02:16:33.315] +++ [0315 02:16:33] Creating namespace namespace-1552616193-32729
I0315 02:16:33.390] namespace/namespace-1552616193-32729 created
I0315 02:16:33.463] Context "test" modified.
I0315 02:16:33.563] generic-resources.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 02:16:33.870] generic-resources.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 02:16:33.873] Successful
I0315 02:16:33.873] message:pod/busybox0 created
I0315 02:16:33.873] pod/busybox1 created
I0315 02:16:33.874] error: error validating "hack/testdata/recursive/pod/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0315 02:16:33.874] has:error validating data: kind not set
I0315 02:16:33.973] generic-resources.sh:211: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 02:16:34.167] generic-resources.sh:219: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: busybox:busybox:
I0315 02:16:34.170] Successful
I0315 02:16:34.170] message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 02:16:34.171] has:Object 'Kind' is missing
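[Every recursive-resource failure in this block traces back to one deliberately broken manifest: the field is spelled "ind" rather than "kind", so the decoded Kind is empty and the object is rejected before any apply/annotate/patch step runs. A standalone sketch of that check, illustrative and not the apimachinery decoder:

    package main

    import (
    	"encoding/json"
    	"fmt"
    )

    type typeMeta struct {
    	APIVersion string `json:"apiVersion"`
    	Kind       string `json:"kind"`
    }

    // kindOf decodes just the type metadata; unknown fields such as the
    // misspelled "ind" are silently ignored, leaving Kind empty.
    func kindOf(doc []byte) (string, error) {
    	var tm typeMeta
    	if err := json.Unmarshal(doc, &tm); err != nil {
    		return "", err
    	}
    	if tm.Kind == "" {
    		return "", fmt.Errorf("Object 'Kind' is missing in '%s'", doc)
    	}
    	return tm.Kind, nil
    }

    func main() {
    	broken := []byte(`{"apiVersion":"v1","ind":"Pod","metadata":{"name":"busybox2"}}`)
    	if _, err := kindOf(broken); err != nil {
    		fmt.Println(err) // Object 'Kind' is missing in '...'
    	}
    }
]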
I0315 02:16:34.272] generic-resources.sh:226: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 02:16:34.575] generic-resources.sh:230: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
I0315 02:16:34.578] Successful
I0315 02:16:34.579] message:pod/busybox0 replaced
I0315 02:16:34.579] pod/busybox1 replaced
I0315 02:16:34.579] error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0315 02:16:34.579] has:error validating data: kind not set
I0315 02:16:34.678] generic-resources.sh:235: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 02:16:34.783] Successful
I0315 02:16:34.783] message:Name:               busybox0
I0315 02:16:34.784] Namespace:          namespace-1552616193-32729
I0315 02:16:34.784] Priority:           0
I0315 02:16:34.784] PriorityClassName:  <none>
... skipping 159 lines ...
I0315 02:16:34.805] has:Object 'Kind' is missing
I0315 02:16:34.891] generic-resources.sh:245: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 02:16:35.085] generic-resources.sh:249: Successful get pods {{range.items}}{{.metadata.annotations.annotatekey}}:{{end}}: annotatevalue:annotatevalue:
I0315 02:16:35.088] Successful
I0315 02:16:35.088] message:pod/busybox0 annotated
I0315 02:16:35.088] pod/busybox1 annotated
I0315 02:16:35.088] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 02:16:35.088] has:Object 'Kind' is missing
I0315 02:16:35.190] generic-resources.sh:254: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 02:16:35.509] generic-resources.sh:258: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
I0315 02:16:35.512] Successful
I0315 02:16:35.513] message:Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
I0315 02:16:35.513] pod/busybox0 configured
I0315 02:16:35.513] Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
I0315 02:16:35.513] pod/busybox1 configured
I0315 02:16:35.514] error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0315 02:16:35.514] has:error validating data: kind not set
I0315 02:16:35.608] generic-resources.sh:264: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 02:16:35.788] deployment.apps/nginx created
W0315 02:16:35.889] kubectl run --generator=deployment/apps.v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
W0315 02:16:35.889] I0315 02:16:33.048470   58898 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552616192-9178", Name:"test1", UID:"5a16fcaf-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"969", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test1-848d5d4b47 to 1
W0315 02:16:35.889] I0315 02:16:33.054943   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616192-9178", Name:"test1-848d5d4b47", UID:"5a180b99-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"970", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test1-848d5d4b47-qnnkl
W0315 02:16:35.890] I0315 02:16:33.529629   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
... skipping 55 lines ...
I0315 02:16:36.256] deployment.extensions "nginx" deleted
I0315 02:16:36.361] generic-resources.sh:280: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 02:16:36.539] generic-resources.sh:284: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 02:16:36.542] Successful
I0315 02:16:36.542] message:kubectl convert is DEPRECATED and will be removed in a future version.
I0315 02:16:36.542] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
I0315 02:16:36.543] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 02:16:36.543] has:Object 'Kind' is missing
I0315 02:16:36.637] generic-resources.sh:289: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 02:16:36.728] Successful
I0315 02:16:36.729] message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 02:16:36.729] has:busybox0:busybox1:
I0315 02:16:36.730] Successful
I0315 02:16:36.731] message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 02:16:36.731] has:Object 'Kind' is missing
I0315 02:16:36.826] generic-resources.sh:298: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 02:16:36.928] pod/busybox0 labeled pod/busybox1 labeled error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 02:16:37.024] generic-resources.sh:303: Successful get pods {{range.items}}{{.metadata.labels.mylabel}}:{{end}}: myvalue:myvalue:
I0315 02:16:37.025] Successful
I0315 02:16:37.026] message:pod/busybox0 labeled
I0315 02:16:37.026] pod/busybox1 labeled
I0315 02:16:37.026] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 02:16:37.027] has:Object 'Kind' is missing
I0315 02:16:37.124] generic-resources.sh:308: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 02:16:37.218] pod/busybox0 patched pod/busybox1 patched error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
W0315 02:16:37.319] kubectl convert is DEPRECATED and will be removed in a future version.
W0315 02:16:37.319] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
W0315 02:16:37.319] I0315 02:16:36.531181   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:16:37.319] I0315 02:16:36.531486   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:16:37.320] I0315 02:16:37.020382   58898 namespace_controller.go:171] Namespace has been deleted non-native-resources
I0315 02:16:37.420] generic-resources.sh:313: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: prom/busybox:prom/busybox:
I0315 02:16:37.420] Successful
I0315 02:16:37.421] message:pod/busybox0 patched
I0315 02:16:37.421] pod/busybox1 patched
I0315 02:16:37.421] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 02:16:37.421] has:Object 'Kind' is missing
I0315 02:16:37.436] generic-resources.sh:318: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 02:16:37.628] generic-resources.sh:322: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 02:16:37.631] Successful
I0315 02:16:37.631] message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
I0315 02:16:37.631] pod "busybox0" force deleted
I0315 02:16:37.631] pod "busybox1" force deleted
I0315 02:16:37.631] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I0315 02:16:37.632] has:Object 'Kind' is missing
I0315 02:16:37.730] generic-resources.sh:327: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 02:16:37.913] replicationcontroller/busybox0 created
I0315 02:16:37.921] replicationcontroller/busybox1 created
W0315 02:16:38.022] I0315 02:16:37.531804   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:16:38.022] I0315 02:16:37.532000   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:16:38.023] I0315 02:16:37.919720   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552616193-32729", Name:"busybox0", UID:"5cfe8632-46c8-11e9-91a6-0242ac110002", APIVersion:"v1", ResourceVersion:"1026", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-8htw4
W0315 02:16:38.023] error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W0315 02:16:38.023] I0315 02:16:37.927900   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552616193-32729", Name:"busybox1", UID:"5cff7d99-46c8-11e9-91a6-0242ac110002", APIVersion:"v1", ResourceVersion:"1028", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-42j4c
I0315 02:16:38.123] generic-resources.sh:331: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 02:16:38.146] generic-resources.sh:336: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 02:16:38.248] generic-resources.sh:337: Successful get rc busybox0 {{.spec.replicas}}: 1
I0315 02:16:38.350] generic-resources.sh:338: Successful get rc busybox1 {{.spec.replicas}}: 1
I0315 02:16:38.567] generic-resources.sh:343: Successful get hpa busybox0 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
I0315 02:16:38.676] generic-resources.sh:344: Successful get hpa busybox1 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
I0315 02:16:38.679] Successful
I0315 02:16:38.680] message:horizontalpodautoscaler.autoscaling/busybox0 autoscaled
I0315 02:16:38.680] horizontalpodautoscaler.autoscaling/busybox1 autoscaled
I0315 02:16:38.680] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 02:16:38.680] has:Object 'Kind' is missing
I0315 02:16:38.762] horizontalpodautoscaler.autoscaling "busybox0" deleted
I0315 02:16:38.854] horizontalpodautoscaler.autoscaling "busybox1" deleted
W0315 02:16:38.955] I0315 02:16:38.532284   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:16:38.955] I0315 02:16:38.532497   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:16:39.056] generic-resources.sh:352: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 02:16:39.071] generic-resources.sh:353: Successful get rc busybox0 {{.spec.replicas}}: 1
I0315 02:16:39.180] generic-resources.sh:354: Successful get rc busybox1 {{.spec.replicas}}: 1
I0315 02:16:39.380] generic-resources.sh:358: Successful get service busybox0 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
I0315 02:16:39.485] generic-resources.sh:359: Successful get service busybox1 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
I0315 02:16:39.488] Successful
I0315 02:16:39.488] message:service/busybox0 exposed
I0315 02:16:39.488] service/busybox1 exposed
I0315 02:16:39.488] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 02:16:39.488] has:Object 'Kind' is missing
I0315 02:16:39.593] generic-resources.sh:365: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 02:16:39.695] generic-resources.sh:366: Successful get rc busybox0 {{.spec.replicas}}: 1
I0315 02:16:39.793] generic-resources.sh:367: Successful get rc busybox1 {{.spec.replicas}}: 1
I0315 02:16:40.003] generic-resources.sh:371: Successful get rc busybox0 {{.spec.replicas}}: 2
I0315 02:16:40.097] generic-resources.sh:372: Successful get rc busybox1 {{.spec.replicas}}: 2
I0315 02:16:40.100] Successful
I0315 02:16:40.100] message:replicationcontroller/busybox0 scaled
I0315 02:16:40.100] replicationcontroller/busybox1 scaled
I0315 02:16:40.100] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 02:16:40.101] has:Object 'Kind' is missing
I0315 02:16:40.199] generic-resources.sh:377: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0315 02:16:40.378] generic-resources.sh:381: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 02:16:40.380] Successful
I0315 02:16:40.381] message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
I0315 02:16:40.381] replicationcontroller "busybox0" force deleted
I0315 02:16:40.381] replicationcontroller "busybox1" force deleted
I0315 02:16:40.381] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 02:16:40.381] has:Object 'Kind' is missing
I0315 02:16:40.467] generic-resources.sh:386: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 02:16:40.636] deployment.apps/nginx1-deployment created
I0315 02:16:40.641] deployment.apps/nginx0-deployment created
W0315 02:16:40.742] I0315 02:16:39.532790   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:16:40.742] I0315 02:16:39.533277   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:16:40.742] I0315 02:16:39.895171   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552616193-32729", Name:"busybox0", UID:"5cfe8632-46c8-11e9-91a6-0242ac110002", APIVersion:"v1", ResourceVersion:"1047", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-8fg78
W0315 02:16:40.743] I0315 02:16:39.905024   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552616193-32729", Name:"busybox1", UID:"5cff7d99-46c8-11e9-91a6-0242ac110002", APIVersion:"v1", ResourceVersion:"1051", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-9xl4v
W0315 02:16:40.743] I0315 02:16:40.533526   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:16:40.743] I0315 02:16:40.533710   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:16:40.743] error: error validating "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W0315 02:16:40.744] I0315 02:16:40.642251   58898 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552616193-32729", Name:"nginx1-deployment", UID:"5e9e0568-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1068", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx1-deployment-7c76c6cbb8 to 2
W0315 02:16:40.744] I0315 02:16:40.646981   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616193-32729", Name:"nginx1-deployment-7c76c6cbb8", UID:"5e9edc0d-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1070", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-7c76c6cbb8-d68jm
W0315 02:16:40.744] I0315 02:16:40.647754   58898 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552616193-32729", Name:"nginx0-deployment", UID:"5e9ed104-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1069", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx0-deployment-7bb85585d7 to 2
W0315 02:16:40.745] I0315 02:16:40.652230   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616193-32729", Name:"nginx1-deployment-7c76c6cbb8", UID:"5e9edc0d-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1070", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-7c76c6cbb8-wfsrv
W0315 02:16:40.745] I0315 02:16:40.655641   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616193-32729", Name:"nginx0-deployment-7bb85585d7", UID:"5e9faa1c-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1073", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-7bb85585d7-p255m
W0315 02:16:40.745] I0315 02:16:40.660891   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616193-32729", Name:"nginx0-deployment-7bb85585d7", UID:"5e9faa1c-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1073", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-7bb85585d7-l2wgd
I0315 02:16:40.846] generic-resources.sh:390: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx0-deployment:nginx1-deployment:
I0315 02:16:40.885] generic-resources.sh:391: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
I0315 02:16:41.083] generic-resources.sh:395: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
I0315 02:16:41.085] Successful
I0315 02:16:41.085] message:deployment.apps/nginx1-deployment skipped rollback (current template already matches revision 1)
I0315 02:16:41.086] deployment.apps/nginx0-deployment skipped rollback (current template already matches revision 1)
I0315 02:16:41.086] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0315 02:16:41.086] has:Object 'Kind' is missing
I0315 02:16:41.197] deployment.apps/nginx1-deployment paused
I0315 02:16:41.208] deployment.apps/nginx0-deployment paused
I0315 02:16:41.313] generic-resources.sh:402: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: true:true:
I0315 02:16:41.315] Successful
I0315 02:16:41.316] message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
... skipping 4 lines ...
I0315 02:16:41.532] Successful
I0315 02:16:41.533] message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0315 02:16:41.533] has:Object 'Kind' is missing
W0315 02:16:41.633] I0315 02:16:41.533969   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:16:41.633] I0315 02:16:41.534251   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:16:41.710] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0315 02:16:41.727] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0315 02:16:41.827] Successful
I0315 02:16:41.828] message:deployment.apps/nginx1-deployment 
I0315 02:16:41.828] REVISION  CHANGE-CAUSE
I0315 02:16:41.828] 1         <none>
I0315 02:16:41.828] 
I0315 02:16:41.828] deployment.apps/nginx0-deployment 
I0315 02:16:41.828] REVISION  CHANGE-CAUSE
I0315 02:16:41.828] 1         <none>
I0315 02:16:41.828] 
I0315 02:16:41.829] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0315 02:16:41.829] has:nginx0-deployment
I0315 02:16:41.829] Successful
I0315 02:16:41.829] message:deployment.apps/nginx1-deployment 
I0315 02:16:41.829] REVISION  CHANGE-CAUSE
I0315 02:16:41.829] 1         <none>
I0315 02:16:41.829] 
I0315 02:16:41.829] deployment.apps/nginx0-deployment 
I0315 02:16:41.829] REVISION  CHANGE-CAUSE
I0315 02:16:41.830] 1         <none>
I0315 02:16:41.830] 
I0315 02:16:41.830] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0315 02:16:41.830] has:nginx1-deployment
I0315 02:16:41.830] Successful
I0315 02:16:41.830] message:deployment.apps/nginx1-deployment 
I0315 02:16:41.830] REVISION  CHANGE-CAUSE
I0315 02:16:41.831] 1         <none>
I0315 02:16:41.831] 
I0315 02:16:41.831] deployment.apps/nginx0-deployment 
I0315 02:16:41.831] REVISION  CHANGE-CAUSE
I0315 02:16:41.831] 1         <none>
I0315 02:16:41.831] 
I0315 02:16:41.831] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I0315 02:16:41.832] has:Object 'Kind' is missing
I0315 02:16:41.832] deployment.apps "nginx1-deployment" force deleted
I0315 02:16:41.832] deployment.apps "nginx0-deployment" force deleted
W0315 02:16:42.535] I0315 02:16:42.534568   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:16:42.535] I0315 02:16:42.534802   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:16:42.825] generic-resources.sh:424: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 9 lines ...
I0315 02:16:43.191] message:no rollbacker has been implemented for "ReplicationController"
I0315 02:16:43.191] no rollbacker has been implemented for "ReplicationController"
I0315 02:16:43.192] unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 02:16:43.192] has:Object 'Kind' is missing
I0315 02:16:43.277] Successful
I0315 02:16:43.278] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 02:16:43.278] error: replicationcontrollers "busybox0" pausing is not supported
I0315 02:16:43.278] error: replicationcontrollers "busybox1" pausing is not supported
I0315 02:16:43.278] has:Object 'Kind' is missing
I0315 02:16:43.279] Successful
I0315 02:16:43.280] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 02:16:43.280] error: replicationcontrollers "busybox0" pausing is not supported
I0315 02:16:43.280] error: replicationcontrollers "busybox1" pausing is not supported
I0315 02:16:43.280] has:replicationcontrollers "busybox0" pausing is not supported
I0315 02:16:43.281] Successful
I0315 02:16:43.282] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 02:16:43.282] error: replicationcontrollers "busybox0" pausing is not supported
I0315 02:16:43.282] error: replicationcontrollers "busybox1" pausing is not supported
I0315 02:16:43.282] has:replicationcontrollers "busybox1" pausing is not supported
I0315 02:16:43.382] Successful
I0315 02:16:43.382] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 02:16:43.382] error: replicationcontrollers "busybox0" resuming is not supported
I0315 02:16:43.383] error: replicationcontrollers "busybox1" resuming is not supported
I0315 02:16:43.383] has:Object 'Kind' is missing
I0315 02:16:43.384] Successful
I0315 02:16:43.384] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 02:16:43.385] error: replicationcontrollers "busybox0" resuming is not supported
I0315 02:16:43.385] error: replicationcontrollers "busybox1" resuming is not supported
I0315 02:16:43.385] has:replicationcontrollers "busybox0" resuming is not supported
I0315 02:16:43.386] Successful
I0315 02:16:43.386] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I0315 02:16:43.387] error: replicationcontrollers "busybox0" resuming is not supported
I0315 02:16:43.387] error: replicationcontrollers "busybox1" resuming is not supported
I0315 02:16:43.387] has:replicationcontrollers "busybox0" resuming is not supported
I0315 02:16:43.457] replicationcontroller "busybox0" force deleted
I0315 02:16:43.462] replicationcontroller "busybox1" force deleted
W0315 02:16:43.563] error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W0315 02:16:43.563] I0315 02:16:42.995160   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552616193-32729", Name:"busybox0", UID:"60054161-46c8-11e9-91a6-0242ac110002", APIVersion:"v1", ResourceVersion:"1117", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-28g75
W0315 02:16:43.563] I0315 02:16:43.000412   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552616193-32729", Name:"busybox1", UID:"600617ae-46c8-11e9-91a6-0242ac110002", APIVersion:"v1", ResourceVersion:"1119", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-cs698
W0315 02:16:43.564] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W0315 02:16:43.564] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
W0315 02:16:43.564] I0315 02:16:43.535063   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:16:43.564] I0315 02:16:43.535298   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:16:44.469] Recording: run_namespace_tests
I0315 02:16:44.469] Running command: run_namespace_tests
I0315 02:16:44.490] 
I0315 02:16:44.492] +++ Running case: test-cmd.run_namespace_tests 
... skipping 10 lines ...
W0315 02:16:46.537] I0315 02:16:46.536623   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:16:46.537] I0315 02:16:46.536820   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:16:47.537] I0315 02:16:47.537235   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:16:47.538] I0315 02:16:47.537490   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:16:48.538] I0315 02:16:48.537829   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:16:48.539] I0315 02:16:48.538098   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:16:48.896] E0315 02:16:48.895992   58898 resource_quota_controller.go:437] failed to sync resource monitors: couldn't start monitor for resource "extensions/v1beta1, Resource=networkpolicies": unable to monitor quota for resource "extensions/v1beta1, Resource=networkpolicies"
W0315 02:16:49.539] I0315 02:16:49.538379   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:16:49.539] I0315 02:16:49.538695   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:16:49.629] I0315 02:16:49.629313   58898 controller_utils.go:1027] Waiting for caches to sync for garbage collector controller
W0315 02:16:49.730] I0315 02:16:49.729692   58898 controller_utils.go:1034] Caches are synced for garbage collector controller
I0315 02:16:49.845] namespace/my-namespace condition met
I0315 02:16:49.926] Successful
I0315 02:16:49.926] message:Error from server (NotFound): namespaces "my-namespace" not found
I0315 02:16:49.926] has: not found
I0315 02:16:50.027] core.sh:1336: Successful get namespaces {{range.items}}{{ if eq $id_field \"other\" }}found{{end}}{{end}}:: :
I0315 02:16:50.098] namespace/other created
I0315 02:16:50.196] core.sh:1340: Successful get namespaces/other {{.metadata.name}}: other
I0315 02:16:50.293] core.sh:1344: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 02:16:50.477] pod/valid-pod created
W0315 02:16:50.578] I0315 02:16:50.539251   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:16:50.579] I0315 02:16:50.539592   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:16:50.679] core.sh:1348: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0315 02:16:50.692] core.sh:1350: Successful get pods -n other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0315 02:16:50.776] Successful
I0315 02:16:50.776] message:error: a resource cannot be retrieved by name across all namespaces
I0315 02:16:50.776] has:a resource cannot be retrieved by name across all namespaces
I0315 02:16:50.872] core.sh:1357: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I0315 02:16:50.961] pod "valid-pod" force deleted
I0315 02:16:51.059] core.sh:1361: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 02:16:51.139] namespace "other" deleted
W0315 02:16:51.240] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
... skipping 157 lines ...
I0315 02:17:12.006] +++ command: run_client_config_tests
I0315 02:17:12.022] +++ [0315 02:17:12] Creating namespace namespace-1552616232-19249
I0315 02:17:12.099] namespace/namespace-1552616232-19249 created
I0315 02:17:12.170] Context "test" modified.
I0315 02:17:12.179] +++ [0315 02:17:12] Testing client config
I0315 02:17:12.252] Successful
I0315 02:17:12.252] message:error: stat missing: no such file or directory
I0315 02:17:12.252] has:missing: no such file or directory
I0315 02:17:12.329] Successful
I0315 02:17:12.329] message:error: stat missing: no such file or directory
I0315 02:17:12.329] has:missing: no such file or directory
I0315 02:17:12.400] Successful
I0315 02:17:12.400] message:error: stat missing: no such file or directory
I0315 02:17:12.400] has:missing: no such file or directory
I0315 02:17:12.472] Successful
I0315 02:17:12.472] message:Error in configuration: context was not found for specified context: missing-context
I0315 02:17:12.473] has:context was not found for specified context: missing-context
I0315 02:17:12.545] Successful
I0315 02:17:12.546] message:error: no server found for cluster "missing-cluster"
I0315 02:17:12.546] has:no server found for cluster "missing-cluster"
I0315 02:17:12.618] Successful
I0315 02:17:12.618] message:error: auth info "missing-user" does not exist
I0315 02:17:12.618] has:auth info "missing-user" does not exist
W0315 02:17:12.719] I0315 02:17:12.553925   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:17:12.719] I0315 02:17:12.554143   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:17:12.820] Successful
I0315 02:17:12.820] message:error: Error loading config file "/tmp/newconfig.yaml": no kind "Config" is registered for version "v-1" in scheme "k8s.io/client-go/tools/clientcmd/api/latest/latest.go:50"
I0315 02:17:12.820] has:Error loading config file
I0315 02:17:12.830] Successful
I0315 02:17:12.830] message:error: stat missing-config: no such file or directory
I0315 02:17:12.830] has:no such file or directory
I0315 02:17:12.848] +++ exit code: 0
I0315 02:17:12.898] Recording: run_service_accounts_tests
I0315 02:17:12.899] Running command: run_service_accounts_tests
I0315 02:17:12.923] 
I0315 02:17:12.925] +++ Running case: test-cmd.run_service_accounts_tests 
... skipping 46 lines ...
I0315 02:17:19.829] Labels:                        run=pi
I0315 02:17:19.829] Annotations:                   <none>
I0315 02:17:19.829] Schedule:                      59 23 31 2 *
I0315 02:17:19.829] Concurrency Policy:            Allow
I0315 02:17:19.829] Suspend:                       False
I0315 02:17:19.829] Successful Job History Limit:  824641955752
I0315 02:17:19.829] Failed Job History Limit:      1
I0315 02:17:19.830] Starting Deadline Seconds:     <unset>
I0315 02:17:19.830] Selector:                      <unset>
I0315 02:17:19.830] Parallelism:                   <unset>
I0315 02:17:19.830] Completions:                   <unset>
I0315 02:17:19.830] Pod Template:
I0315 02:17:19.830]   Labels:  run=pi
... skipping 31 lines ...
I0315 02:17:20.392]                 job-name=test-job
I0315 02:17:20.392]                 run=pi
I0315 02:17:20.392] Annotations:    cronjob.kubernetes.io/instantiate: manual
I0315 02:17:20.392] Parallelism:    1
I0315 02:17:20.392] Completions:    1
I0315 02:17:20.392] Start Time:     Fri, 15 Mar 2019 02:17:20 +0000
I0315 02:17:20.392] Pods Statuses:  1 Running / 0 Succeeded / 0 Failed
I0315 02:17:20.393] Pod Template:
I0315 02:17:20.393]   Labels:  controller-uid=76244e1a-46c8-11e9-91a6-0242ac110002
I0315 02:17:20.393]            job-name=test-job
I0315 02:17:20.393]            run=pi
I0315 02:17:20.393]   Containers:
I0315 02:17:20.393]    pi:
... skipping 411 lines ...
I0315 02:17:30.365]   sessionAffinity: None
I0315 02:17:30.365]   type: ClusterIP
I0315 02:17:30.365] status:
I0315 02:17:30.365]   loadBalancer: {}
W0315 02:17:30.465] I0315 02:17:29.563491   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:17:30.466] I0315 02:17:29.563730   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:17:30.466] error: you must specify resources by --filename when --local is set.
W0315 02:17:30.466] Example resource specifications include:
W0315 02:17:30.466]    '-f rsrc.yaml'
W0315 02:17:30.466]    '--filename=rsrc.json'
W0315 02:17:30.564] I0315 02:17:30.564036   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:17:30.565] I0315 02:17:30.564265   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:17:30.665] core.sh:886: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
... skipping 124 lines ...
I0315 02:17:38.165] apps.sh:80: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0315 02:17:38.266] apps.sh:81: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
I0315 02:17:38.380] daemonset.extensions/bind rolled back
I0315 02:17:38.493] apps.sh:84: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I0315 02:17:38.598] apps.sh:85: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0315 02:17:38.712] Successful
I0315 02:17:38.713] message:error: unable to find specified revision 1000000 in history
I0315 02:17:38.713] has:unable to find specified revision
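This pair first requests an impossible rollback target and then a valid one; against the bind daemonset used here, the commands take the shape:

    kubectl rollout undo daemonset/bind --to-revision=1000000   # error: unable to find specified revision 1000000 in history
    kubectl rollout undo daemonset/bind                         # rolls back to the previous revision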
W0315 02:17:38.813] I0315 02:17:38.568561   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:17:38.814] I0315 02:17:38.568798   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:17:38.914] apps.sh:89: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I0315 02:17:38.929] apps.sh:90: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0315 02:17:39.041] daemonset.extensions/bind rolled back
W0315 02:17:39.145] E0315 02:17:39.079921   58898 daemon_controller.go:302] namespace-1552616256-7620/bind failed with : error storing status for daemon set &v1.DaemonSet{... skipping full object dump ...}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
I0315 02:17:39.246] apps.sh:93: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
I0315 02:17:39.262] apps.sh:94: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0315 02:17:39.357] apps.sh:95: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
I0315 02:17:39.438] daemonset.apps "bind" deleted
I0315 02:17:39.462] +++ exit code: 0
I0315 02:17:39.511] Recording: run_rc_tests
... skipping 28 lines ...
I0315 02:17:40.733] Namespace:    namespace-1552616259-1673
I0315 02:17:40.733] Selector:     app=guestbook,tier=frontend
I0315 02:17:40.733] Labels:       app=guestbook
I0315 02:17:40.734]               tier=frontend
I0315 02:17:40.734] Annotations:  <none>
I0315 02:17:40.734] Replicas:     3 current / 3 desired
I0315 02:17:40.734] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 02:17:40.734] Pod Template:
I0315 02:17:40.734]   Labels:  app=guestbook
I0315 02:17:40.734]            tier=frontend
I0315 02:17:40.734]   Containers:
I0315 02:17:40.735]    php-redis:
I0315 02:17:40.735]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
I0315 02:17:40.855] Namespace:    namespace-1552616259-1673
I0315 02:17:40.855] Selector:     app=guestbook,tier=frontend
I0315 02:17:40.855] Labels:       app=guestbook
I0315 02:17:40.856]               tier=frontend
I0315 02:17:40.856] Annotations:  <none>
I0315 02:17:40.856] Replicas:     3 current / 3 desired
I0315 02:17:40.856] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 02:17:40.856] Pod Template:
I0315 02:17:40.856]   Labels:  app=guestbook
I0315 02:17:40.856]            tier=frontend
I0315 02:17:40.856]   Containers:
I0315 02:17:40.856]    php-redis:
I0315 02:17:40.857]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
I0315 02:17:40.972] Namespace:    namespace-1552616259-1673
I0315 02:17:40.972] Selector:     app=guestbook,tier=frontend
I0315 02:17:40.972] Labels:       app=guestbook
I0315 02:17:40.972]               tier=frontend
I0315 02:17:40.972] Annotations:  <none>
I0315 02:17:40.972] Replicas:     3 current / 3 desired
I0315 02:17:40.973] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 02:17:40.973] Pod Template:
I0315 02:17:40.973]   Labels:  app=guestbook
I0315 02:17:40.973]            tier=frontend
I0315 02:17:40.973]   Containers:
I0315 02:17:40.973]    php-redis:
I0315 02:17:40.973]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 12 lines ...
I0315 02:17:41.092] Namespace:    namespace-1552616259-1673
I0315 02:17:41.092] Selector:     app=guestbook,tier=frontend
I0315 02:17:41.092] Labels:       app=guestbook
I0315 02:17:41.092]               tier=frontend
I0315 02:17:41.092] Annotations:  <none>
I0315 02:17:41.092] Replicas:     3 current / 3 desired
I0315 02:17:41.092] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 02:17:41.093] Pod Template:
I0315 02:17:41.093]   Labels:  app=guestbook
I0315 02:17:41.093]            tier=frontend
I0315 02:17:41.093]   Containers:
I0315 02:17:41.093]    php-redis:
I0315 02:17:41.093]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
I0315 02:17:41.270] Namespace:    namespace-1552616259-1673
I0315 02:17:41.270] Selector:     app=guestbook,tier=frontend
I0315 02:17:41.271] Labels:       app=guestbook
I0315 02:17:41.271]               tier=frontend
I0315 02:17:41.271] Annotations:  <none>
I0315 02:17:41.271] Replicas:     3 current / 3 desired
I0315 02:17:41.271] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 02:17:41.271] Pod Template:
I0315 02:17:41.271]   Labels:  app=guestbook
I0315 02:17:41.271]            tier=frontend
I0315 02:17:41.272]   Containers:
I0315 02:17:41.272]    php-redis:
I0315 02:17:41.272]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
I0315 02:17:41.386] Namespace:    namespace-1552616259-1673
I0315 02:17:41.386] Selector:     app=guestbook,tier=frontend
I0315 02:17:41.386] Labels:       app=guestbook
I0315 02:17:41.386]               tier=frontend
I0315 02:17:41.387] Annotations:  <none>
I0315 02:17:41.387] Replicas:     3 current / 3 desired
I0315 02:17:41.387] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 02:17:41.387] Pod Template:
I0315 02:17:41.387]   Labels:  app=guestbook
I0315 02:17:41.387]            tier=frontend
I0315 02:17:41.387]   Containers:
I0315 02:17:41.387]    php-redis:
I0315 02:17:41.388]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
I0315 02:17:41.498] Namespace:    namespace-1552616259-1673
I0315 02:17:41.498] Selector:     app=guestbook,tier=frontend
I0315 02:17:41.499] Labels:       app=guestbook
I0315 02:17:41.499]               tier=frontend
I0315 02:17:41.499] Annotations:  <none>
I0315 02:17:41.499] Replicas:     3 current / 3 desired
I0315 02:17:41.499] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 02:17:41.499] Pod Template:
I0315 02:17:41.499]   Labels:  app=guestbook
I0315 02:17:41.499]            tier=frontend
I0315 02:17:41.499]   Containers:
I0315 02:17:41.500]    php-redis:
I0315 02:17:41.500]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 11 lines ...
I0315 02:17:41.612] Namespace:    namespace-1552616259-1673
I0315 02:17:41.613] Selector:     app=guestbook,tier=frontend
I0315 02:17:41.613] Labels:       app=guestbook
I0315 02:17:41.613]               tier=frontend
I0315 02:17:41.613] Annotations:  <none>
I0315 02:17:41.613] Replicas:     3 current / 3 desired
I0315 02:17:41.613] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 02:17:41.613] Pod Template:
I0315 02:17:41.614]   Labels:  app=guestbook
I0315 02:17:41.614]            tier=frontend
I0315 02:17:41.614]   Containers:
I0315 02:17:41.614]    php-redis:
I0315 02:17:41.614]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 24 lines ...
I0315 02:17:42.683] replicationcontroller/frontend scaled
I0315 02:17:42.781] core.sh:1095: Successful get rc frontend {{.spec.replicas}}: 2
I0315 02:17:42.859] replicationcontroller "frontend" deleted
W0315 02:17:42.960] I0315 02:17:41.569955   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:17:42.960] I0315 02:17:41.570185   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:17:42.961] I0315 02:17:41.804355   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552616259-1673", Name:"frontend", UID:"82479f3d-46c8-11e9-91a6-0242ac110002", APIVersion:"v1", ResourceVersion:"1405", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-kjr4q
W0315 02:17:42.961] error: Expected replicas to be 3, was 2
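"Expected replicas to be 3, was 2" is kubectl scale's client-side precondition rather than a server failure: --current-replicas turns the scale into a compare-and-set. Assuming the frontend replication controller from this test:

    # only applies if spec.replicas is exactly 3 at read time
    kubectl scale rc frontend --current-replicas=3 --replicas=2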
W0315 02:17:42.961] I0315 02:17:42.393674   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552616259-1673", Name:"frontend", UID:"82479f3d-46c8-11e9-91a6-0242ac110002", APIVersion:"v1", ResourceVersion:"1411", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-zd5r2
W0315 02:17:42.962] I0315 02:17:42.570421   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:17:42.962] I0315 02:17:42.570619   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:17:42.962] I0315 02:17:42.691158   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552616259-1673", Name:"frontend", UID:"82479f3d-46c8-11e9-91a6-0242ac110002", APIVersion:"v1", ResourceVersion:"1417", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-zd5r2
W0315 02:17:43.048] I0315 02:17:43.047925   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552616259-1673", Name:"redis-master", UID:"83d0342d-46c8-11e9-91a6-0242ac110002", APIVersion:"v1", ResourceVersion:"1428", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-5qv9g
I0315 02:17:43.149] replicationcontroller/redis-master created
... skipping 42 lines ...
I0315 02:17:44.960] service "expose-test-deployment" deleted
I0315 02:17:45.064] Successful
I0315 02:17:45.064] message:service/expose-test-deployment exposed
I0315 02:17:45.064] has:service/expose-test-deployment exposed
I0315 02:17:45.148] service "expose-test-deployment" deleted
I0315 02:17:45.241] Successful
I0315 02:17:45.241] message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
I0315 02:17:45.241] See 'kubectl expose -h' for help and examples
I0315 02:17:45.242] has:invalid deployment: no selectors
I0315 02:17:45.330] Successful
I0315 02:17:45.330] message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
I0315 02:17:45.331] See 'kubectl expose -h' for help and examples
I0315 02:17:45.331] has:invalid deployment: no selectors
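kubectl expose builds the Service selector from the workload's own .spec.selector, so a deployment without one cannot be exposed by introspection; the usual fix is an explicit selector (the label shown here is illustrative):

    kubectl expose deployment nginx-deployment --port=80 --selector='app=nginx'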
I0315 02:17:45.499] deployment.apps/nginx-deployment created
W0315 02:17:45.599] I0315 02:17:45.503858   58898 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552616259-1673", Name:"nginx-deployment", UID:"85472921-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1534", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-64bb598779 to 3
W0315 02:17:45.600] I0315 02:17:45.507662   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616259-1673", Name:"nginx-deployment-64bb598779", UID:"8547feb2-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1535", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-64bb598779-zlst6
W0315 02:17:45.600] I0315 02:17:45.511949   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616259-1673", Name:"nginx-deployment-64bb598779", UID:"8547feb2-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1535", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-64bb598779-v6n5d
... skipping 29 lines ...
I0315 02:17:47.564] service "frontend-3" deleted
I0315 02:17:47.571] service "frontend-4" deleted
I0315 02:17:47.578] service "frontend-5" deleted
W0315 02:17:47.679] I0315 02:17:47.572810   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:17:47.679] I0315 02:17:47.572932   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:17:47.780] Successful
I0315 02:17:47.780] message:error: cannot expose a Node
I0315 02:17:47.780] has:cannot expose
I0315 02:17:47.780] Successful
I0315 02:17:47.780] message:The Service "invalid-large-service-name-that-has-more-than-sixty-three-characters" is invalid: metadata.name: Invalid value: "invalid-large-service-name-that-has-more-than-sixty-three-characters": must be no more than 63 characters
I0315 02:17:47.780] has:metadata.name: Invalid value
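The 63-character ceiling exists because a Service name must be a valid DNS-1035 label (it becomes a hostname); the follow-up check below confirms that exactly sixty-three characters is still accepted:

    # the test's longest legal name is exactly 63 characters
    echo -n kubernetes-serve-hostname-testing-sixty-three-characters-in-len | wc -c   # 63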
I0315 02:17:47.874] Successful
I0315 02:17:47.874] message:service/kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
... skipping 34 lines ...
I0315 02:17:50.395] horizontalpodautoscaler.autoscaling "frontend" deleted
W0315 02:17:50.496] I0315 02:17:49.573813   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:17:50.496] I0315 02:17:49.574008   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:17:50.496] I0315 02:17:49.769715   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552616259-1673", Name:"frontend", UID:"87d1cf5e-46c8-11e9-91a6-0242ac110002", APIVersion:"v1", ResourceVersion:"1653", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-769kc
W0315 02:17:50.497] I0315 02:17:49.774079   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552616259-1673", Name:"frontend", UID:"87d1cf5e-46c8-11e9-91a6-0242ac110002", APIVersion:"v1", ResourceVersion:"1653", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-52b9m
W0315 02:17:50.497] I0315 02:17:49.774139   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552616259-1673", Name:"frontend", UID:"87d1cf5e-46c8-11e9-91a6-0242ac110002", APIVersion:"v1", ResourceVersion:"1653", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-z2vd7
W0315 02:17:50.497] Error: required flag(s) "max" not set
W0315 02:17:50.497] 
W0315 02:17:50.497] 
W0315 02:17:50.497] Examples:
W0315 02:17:50.497]   # Auto scale a deployment "foo", with the number of pods between 2 and 10, no target CPU utilization specified so a default autoscaling policy will be used:
W0315 02:17:50.498]   kubectl autoscale deployment foo --min=2 --max=10
W0315 02:17:50.498]   
... skipping 57 lines ...
I0315 02:17:50.729]           limits:
I0315 02:17:50.729]             cpu: 300m
I0315 02:17:50.729]           requests:
I0315 02:17:50.729]             cpu: 300m
I0315 02:17:50.729]       terminationGracePeriodSeconds: 0
I0315 02:17:50.730] status: {}
W0315 02:17:50.830] Error from server (NotFound): deployments.apps "nginx-deployment-resources" not found
I0315 02:17:50.993] deployment.apps/nginx-deployment-resources created
W0315 02:17:51.094] I0315 02:17:51.000815   58898 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552616259-1673", Name:"nginx-deployment-resources", UID:"888da0ed-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1674", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-695c766d58 to 3
W0315 02:17:51.095] I0315 02:17:51.005142   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616259-1673", Name:"nginx-deployment-resources-695c766d58", UID:"888e9df6-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1675", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-695c766d58-4r6wm
W0315 02:17:51.095] I0315 02:17:51.010323   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616259-1673", Name:"nginx-deployment-resources-695c766d58", UID:"888e9df6-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1675", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-695c766d58-mdwwf
W0315 02:17:51.095] I0315 02:17:51.012912   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616259-1673", Name:"nginx-deployment-resources-695c766d58", UID:"888e9df6-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1675", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-695c766d58-54fd7
I0315 02:17:51.196] core.sh:1278: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment-resources:
... skipping 4 lines ...
W0315 02:17:51.529] I0315 02:17:51.439001   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616259-1673", Name:"nginx-deployment-resources-5b7fc6dd8b", UID:"88d0b8ad-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1689", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-5b7fc6dd8b-99jxz
W0315 02:17:51.575] I0315 02:17:51.574777   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:17:51.576] I0315 02:17:51.574980   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:17:51.676] core.sh:1283: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 100m:
I0315 02:17:51.676] core.sh:1284: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
I0315 02:17:51.830] deployment.extensions/nginx-deployment-resources resource requirements updated
W0315 02:17:51.931] error: unable to find container named redis
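"unable to find container named redis" means the -c/--containers argument of kubectl set resources named a container absent from the pod template; the failing and corrected shapes, with container names assumed for illustration:

    kubectl set resources deployment nginx-deployment-resources -c=redis --limits=cpu=200m   # no such container
    kubectl set resources deployment nginx-deployment-resources -c=nginx --limits=cpu=200m   # matches a real container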
W0315 02:17:51.932] I0315 02:17:51.855472   58898 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552616259-1673", Name:"nginx-deployment-resources", UID:"888da0ed-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1698", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-695c766d58 to 2
W0315 02:17:51.932] I0315 02:17:51.866933   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616259-1673", Name:"nginx-deployment-resources-695c766d58", UID:"888e9df6-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1702", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-695c766d58-4r6wm
W0315 02:17:51.933] I0315 02:17:51.875278   58898 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552616259-1673", Name:"nginx-deployment-resources", UID:"888da0ed-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1701", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6bc4567bf6 to 1
W0315 02:17:51.933] I0315 02:17:51.880124   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616259-1673", Name:"nginx-deployment-resources-6bc4567bf6", UID:"890e4533-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1708", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6bc4567bf6-c8lr7
I0315 02:17:52.034] core.sh:1289: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
I0315 02:17:52.060] core.sh:1290: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
... skipping 213 lines ...
I0315 02:17:52.579]   observedGeneration: 4
I0315 02:17:52.580]   replicas: 4
I0315 02:17:52.580]   unavailableReplicas: 4
I0315 02:17:52.580]   updatedReplicas: 1
W0315 02:17:52.680] I0315 02:17:52.575284   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:17:52.681] I0315 02:17:52.575476   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:17:52.681] error: you must specify resources by --filename when --local is set.
W0315 02:17:52.681] Example resource specifications include:
W0315 02:17:52.681]    '-f rsrc.yaml'
W0315 02:17:52.681]    '--filename=rsrc.json'
I0315 02:17:52.782] core.sh:1299: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
I0315 02:17:52.827] core.sh:1300: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
I0315 02:17:52.920] core.sh:1301: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
... skipping 46 lines ...
I0315 02:17:54.546]                 pod-template-hash=7875bf5c8b
I0315 02:17:54.546] Annotations:    deployment.kubernetes.io/desired-replicas: 1
I0315 02:17:54.547]                 deployment.kubernetes.io/max-replicas: 2
I0315 02:17:54.547]                 deployment.kubernetes.io/revision: 1
I0315 02:17:54.547] Controlled By:  Deployment/test-nginx-apps
I0315 02:17:54.547] Replicas:       1 current / 1 desired
I0315 02:17:54.547] Pods Status:    0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0315 02:17:54.547] Pod Template:
I0315 02:17:54.547]   Labels:  app=test-nginx-apps
I0315 02:17:54.547]            pod-template-hash=7875bf5c8b
I0315 02:17:54.547]   Containers:
I0315 02:17:54.547]    nginx:
I0315 02:17:54.547]     Image:        k8s.gcr.io/nginx:test-cmd
... skipping 104 lines ...
I0315 02:17:59.049] deployment.extensions/nginx rolled back
W0315 02:17:59.579] I0315 02:17:59.579207   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:17:59.580] I0315 02:17:59.579475   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:18:00.153] apps.sh:300: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0315 02:18:00.365] apps.sh:303: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0315 02:18:00.484] deployment.extensions/nginx rolled back
W0315 02:18:00.585] error: unable to find specified revision 1000000 in history
W0315 02:18:00.585] I0315 02:18:00.579815   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:18:00.585] I0315 02:18:00.579960   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:18:01.581] I0315 02:18:01.580269   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:18:01.581] I0315 02:18:01.580525   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:18:01.681] apps.sh:307: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
I0315 02:18:01.692] deployment.extensions/nginx paused
W0315 02:18:01.810] error: you cannot rollback a paused deployment; resume it first with 'kubectl rollout resume deployment/nginx' and try again
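A paused deployment accumulates spec changes without rolling them out, and rollbacks are refused for the same reason; the recovery sequence is the one the error text spells out, and the next two log lines show it succeeding:

    kubectl rollout resume deployment/nginx
    kubectl rollout undo deployment/nginx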
I0315 02:18:01.914] deployment.extensions/nginx resumed
I0315 02:18:02.072] deployment.extensions/nginx rolled back
I0315 02:18:02.284]     deployment.kubernetes.io/revision-history: 1,3
W0315 02:18:02.476] error: desired revision (3) is different from the running revision (5)
W0315 02:18:02.581] I0315 02:18:02.580824   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:18:02.582] I0315 02:18:02.581086   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:18:02.657] I0315 02:18:02.656514   58898 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552616273-11909", Name:"nginx2", UID:"8f805642-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1920", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx2-78cb9c866 to 3
W0315 02:18:02.662] I0315 02:18:02.661448   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616273-11909", Name:"nginx2-78cb9c866", UID:"8f81417c-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1921", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-78cb9c866-cnbqp
W0315 02:18:02.666] I0315 02:18:02.665705   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616273-11909", Name:"nginx2-78cb9c866", UID:"8f81417c-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1921", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-78cb9c866-d2m4p
W0315 02:18:02.667] I0315 02:18:02.665894   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616273-11909", Name:"nginx2-78cb9c866", UID:"8f81417c-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1921", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-78cb9c866-w65lp
... skipping 17 lines ...
I0315 02:18:03.713] apps.sh:337: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
I0315 02:18:03.716] apps.sh:338: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
I0315 02:18:03.910] deployment.extensions/nginx-deployment image updated
I0315 02:18:04.017] apps.sh:343: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I0315 02:18:04.113] apps.sh:344: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
I0315 02:18:04.212] deployment.apps/nginx-deployment image updated
W0315 02:18:04.312] error: unable to find container named "redis"
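Same failure class as the set resources case earlier: kubectl set image takes container=image pairs, and the container name must exist in the pod template (names here are assumed for illustration):

    kubectl set image deployment/nginx-deployment redis=redis                     # error: unable to find container named "redis"
    kubectl set image deployment/nginx-deployment nginx=k8s.gcr.io/nginx:1.7.9    # an existing container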
I0315 02:18:04.413] apps.sh:347: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
I0315 02:18:04.421] apps.sh:348: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
I0315 02:18:04.597] apps.sh:351: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
I0315 02:18:04.694] apps.sh:352: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
I0315 02:18:04.795] deployment.extensions/nginx-deployment image updated
W0315 02:18:04.895] I0315 02:18:04.581862   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
... skipping 73 lines ...
I0315 02:18:08.727] apps.sh:512: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 02:18:08.909] replicaset.apps/frontend-no-cascade created
W0315 02:18:09.010] I0315 02:18:07.583461   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:18:09.010] I0315 02:18:07.583602   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:18:09.011] I0315 02:18:07.613981   58898 event.go:209] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1552616273-11909", Name:"nginx-deployment", UID:"914962bd-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2119", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-58dbcd7c7f to 0
W0315 02:18:09.011] I0315 02:18:07.673258   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616273-11909", Name:"nginx-deployment-58dbcd7c7f", UID:"91c1e402-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2125", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-58dbcd7c7f-qtvx7
W0315 02:18:09.012] E0315 02:18:07.717897   58898 replica_set.go:450] Sync "namespace-1552616273-11909/nginx-deployment-5b4bdf69f4" failed with Operation cannot be fulfilled on replicasets.apps "nginx-deployment-5b4bdf69f4": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1552616273-11909/nginx-deployment-5b4bdf69f4, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 924719be-46c8-11e9-91a6-0242ac110002, UID in object meta: 
W0315 02:18:09.012] E0315 02:18:07.867830   58898 replica_set.go:450] Sync "namespace-1552616273-11909/nginx-deployment-58dbcd7c7f" failed with replicasets.apps "nginx-deployment-58dbcd7c7f" not found
W0315 02:18:09.012] I0315 02:18:08.444844   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616287-7501", Name:"frontend", UID:"92f3077c-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2148", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-mxdzr
W0315 02:18:09.012] I0315 02:18:08.449229   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616287-7501", Name:"frontend", UID:"92f3077c-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2148", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-8p4nc
W0315 02:18:09.013] I0315 02:18:08.451804   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616287-7501", Name:"frontend", UID:"92f3077c-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2148", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-9282g
W0315 02:18:09.013] I0315 02:18:08.583858   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:18:09.013] I0315 02:18:08.584108   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:18:09.013] I0315 02:18:08.916556   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616287-7501", Name:"frontend-no-cascade", UID:"933b5c1b-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2165", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-fdvxx
W0315 02:18:09.013] I0315 02:18:08.922616   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616287-7501", Name:"frontend-no-cascade", UID:"933b5c1b-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2165", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-5ckpb
W0315 02:18:09.014] I0315 02:18:08.922664   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1552616287-7501", Name:"frontend-no-cascade", UID:"933b5c1b-46c8-11e9-91a6-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2165", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-mrg4w
I0315 02:18:09.114] apps.sh:518: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
I0315 02:18:09.114] +++ [0315 02:18:09] Deleting rs
I0315 02:18:09.120] replicaset.extensions "frontend-no-cascade" deleted
W0315 02:18:09.221] E0315 02:18:09.158342   58898 replica_set.go:450] Sync "namespace-1552616287-7501/frontend-no-cascade" failed with replicasets.apps "frontend-no-cascade" not found
I0315 02:18:09.322] apps.sh:522: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 02:18:09.348] apps.sh:524: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
I0315 02:18:09.432] pod "frontend-no-cascade-5ckpb" deleted
I0315 02:18:09.439] pod "frontend-no-cascade-fdvxx" deleted
I0315 02:18:09.446] pod "frontend-no-cascade-mrg4w" deleted
I0315 02:18:09.548] apps.sh:527: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 5 lines ...
I0315 02:18:10.071] Namespace:    namespace-1552616287-7501
I0315 02:18:10.071] Selector:     app=guestbook,tier=frontend
I0315 02:18:10.071] Labels:       app=guestbook
I0315 02:18:10.071]               tier=frontend
I0315 02:18:10.071] Annotations:  <none>
I0315 02:18:10.071] Replicas:     3 current / 3 desired
I0315 02:18:10.071] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 02:18:10.071] Pod Template:
I0315 02:18:10.071]   Labels:  app=guestbook
I0315 02:18:10.071]            tier=frontend
I0315 02:18:10.071]   Containers:
I0315 02:18:10.072]    php-redis:
I0315 02:18:10.072]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
I0315 02:18:10.194] Namespace:    namespace-1552616287-7501
I0315 02:18:10.194] Selector:     app=guestbook,tier=frontend
I0315 02:18:10.195] Labels:       app=guestbook
I0315 02:18:10.195]               tier=frontend
I0315 02:18:10.195] Annotations:  <none>
I0315 02:18:10.195] Replicas:     3 current / 3 desired
I0315 02:18:10.195] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 02:18:10.195] Pod Template:
I0315 02:18:10.195]   Labels:  app=guestbook
I0315 02:18:10.195]            tier=frontend
I0315 02:18:10.195]   Containers:
I0315 02:18:10.195]    php-redis:
I0315 02:18:10.195]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 23 lines ...
I0315 02:18:10.399] Namespace:    namespace-1552616287-7501
I0315 02:18:10.399] Selector:     app=guestbook,tier=frontend
I0315 02:18:10.400] Labels:       app=guestbook
I0315 02:18:10.400]               tier=frontend
I0315 02:18:10.400] Annotations:  <none>
I0315 02:18:10.400] Replicas:     3 current / 3 desired
I0315 02:18:10.400] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 02:18:10.400] Pod Template:
I0315 02:18:10.400]   Labels:  app=guestbook
I0315 02:18:10.400]            tier=frontend
I0315 02:18:10.400]   Containers:
I0315 02:18:10.400]    php-redis:
I0315 02:18:10.400]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 12 lines ...
I0315 02:18:10.435] Namespace:    namespace-1552616287-7501
I0315 02:18:10.435] Selector:     app=guestbook,tier=frontend
I0315 02:18:10.435] Labels:       app=guestbook
I0315 02:18:10.435]               tier=frontend
I0315 02:18:10.436] Annotations:  <none>
I0315 02:18:10.436] Replicas:     3 current / 3 desired
I0315 02:18:10.436] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 02:18:10.436] Pod Template:
I0315 02:18:10.436]   Labels:  app=guestbook
I0315 02:18:10.436]            tier=frontend
I0315 02:18:10.436]   Containers:
I0315 02:18:10.436]    php-redis:
I0315 02:18:10.436]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 20 lines ...
I0315 02:18:10.687] Namespace:    namespace-1552616287-7501
I0315 02:18:10.687] Selector:     app=guestbook,tier=frontend
I0315 02:18:10.687] Labels:       app=guestbook
I0315 02:18:10.687]               tier=frontend
I0315 02:18:10.687] Annotations:  <none>
I0315 02:18:10.687] Replicas:     3 current / 3 desired
I0315 02:18:10.687] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 02:18:10.687] Pod Template:
I0315 02:18:10.687]   Labels:  app=guestbook
I0315 02:18:10.688]            tier=frontend
I0315 02:18:10.688]   Containers:
I0315 02:18:10.688]    php-redis:
I0315 02:18:10.688]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
I0315 02:18:10.710] Namespace:    namespace-1552616287-7501
I0315 02:18:10.710] Selector:     app=guestbook,tier=frontend
I0315 02:18:10.710] Labels:       app=guestbook
I0315 02:18:10.710]               tier=frontend
I0315 02:18:10.710] Annotations:  <none>
I0315 02:18:10.711] Replicas:     3 current / 3 desired
I0315 02:18:10.711] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 02:18:10.711] Pod Template:
I0315 02:18:10.711]   Labels:  app=guestbook
I0315 02:18:10.711]            tier=frontend
I0315 02:18:10.711]   Containers:
I0315 02:18:10.711]    php-redis:
I0315 02:18:10.711]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
I0315 02:18:10.822] Namespace:    namespace-1552616287-7501
I0315 02:18:10.823] Selector:     app=guestbook,tier=frontend
I0315 02:18:10.823] Labels:       app=guestbook
I0315 02:18:10.823]               tier=frontend
I0315 02:18:10.823] Annotations:  <none>
I0315 02:18:10.823] Replicas:     3 current / 3 desired
I0315 02:18:10.823] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 02:18:10.823] Pod Template:
I0315 02:18:10.823]   Labels:  app=guestbook
I0315 02:18:10.823]            tier=frontend
I0315 02:18:10.823]   Containers:
I0315 02:18:10.823]    php-redis:
I0315 02:18:10.823]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 11 lines ...
I0315 02:18:10.934] Namespace:    namespace-1552616287-7501
I0315 02:18:10.934] Selector:     app=guestbook,tier=frontend
I0315 02:18:10.934] Labels:       app=guestbook
I0315 02:18:10.934]               tier=frontend
I0315 02:18:10.934] Annotations:  <none>
I0315 02:18:10.935] Replicas:     3 current / 3 desired
I0315 02:18:10.935] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I0315 02:18:10.935] Pod Template:
I0315 02:18:10.935]   Labels:  app=guestbook
I0315 02:18:10.935]            tier=frontend
I0315 02:18:10.935]   Containers:
I0315 02:18:10.935]    php-redis:
I0315 02:18:10.935]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 196 lines ...
I0315 02:18:16.518] horizontalpodautoscaler.autoscaling "frontend" deleted
I0315 02:18:16.610] horizontalpodautoscaler.autoscaling/frontend autoscaled
I0315 02:18:16.713] apps.sh:647: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
I0315 02:18:16.796] horizontalpodautoscaler.autoscaling "frontend" deleted
W0315 02:18:16.897] I0315 02:18:16.587930   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:18:16.898] I0315 02:18:16.588186   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:18:16.898] Error: required flag(s) "max" not set
W0315 02:18:16.898] 
W0315 02:18:16.898] 
W0315 02:18:16.898] Examples:
W0315 02:18:16.898]   # Auto scale a deployment "foo", with the number of pods between 2 and 10, no target CPU utilization specified so a default autoscaling policy will be used:
W0315 02:18:16.898]   kubectl autoscale deployment foo --min=2 --max=10
W0315 02:18:16.899]   
... skipping 95 lines ...
I0315 02:18:20.270] apps.sh:431: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I0315 02:18:20.370] apps.sh:432: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
I0315 02:18:20.487] statefulset.apps/nginx rolled back
I0315 02:18:20.591] apps.sh:435: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
I0315 02:18:20.691] apps.sh:436: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0315 02:18:20.801] Successful
I0315 02:18:20.802] message:error: unable to find specified revision 1000000 in history
I0315 02:18:20.802] has:unable to find specified revision
I0315 02:18:20.898] apps.sh:440: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
I0315 02:18:20.994] apps.sh:441: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I0315 02:18:21.106] statefulset.apps/nginx rolled back
W0315 02:18:21.206] I0315 02:18:20.591297   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:18:21.207] I0315 02:18:20.591529   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
... skipping 64 lines ...
I0315 02:18:23.279] Name:         mock
I0315 02:18:23.279] Namespace:    namespace-1552616302-24160
I0315 02:18:23.279] Selector:     app=mock
I0315 02:18:23.280] Labels:       app=mock
I0315 02:18:23.280] Annotations:  <none>
I0315 02:18:23.280] Replicas:     1 current / 1 desired
I0315 02:18:23.280] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0315 02:18:23.280] Pod Template:
I0315 02:18:23.280]   Labels:  app=mock
I0315 02:18:23.280]   Containers:
I0315 02:18:23.280]    mock-container:
I0315 02:18:23.280]     Image:        k8s.gcr.io/pause:2.0
I0315 02:18:23.281]     Port:         9949/TCP
... skipping 62 lines ...
I0315 02:18:25.697] Name:         mock
I0315 02:18:25.697] Namespace:    namespace-1552616302-24160
I0315 02:18:25.697] Selector:     app=mock
I0315 02:18:25.697] Labels:       app=mock
I0315 02:18:25.697] Annotations:  <none>
I0315 02:18:25.697] Replicas:     1 current / 1 desired
I0315 02:18:25.697] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0315 02:18:25.697] Pod Template:
I0315 02:18:25.698]   Labels:  app=mock
I0315 02:18:25.698]   Containers:
I0315 02:18:25.698]    mock-container:
I0315 02:18:25.698]     Image:        k8s.gcr.io/pause:2.0
I0315 02:18:25.698]     Port:         9949/TCP
... skipping 60 lines ...
I0315 02:18:27.988] Name:         mock
I0315 02:18:27.989] Namespace:    namespace-1552616302-24160
I0315 02:18:27.989] Selector:     app=mock
I0315 02:18:27.989] Labels:       app=mock
I0315 02:18:27.989] Annotations:  <none>
I0315 02:18:27.989] Replicas:     1 current / 1 desired
I0315 02:18:27.989] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0315 02:18:27.989] Pod Template:
I0315 02:18:27.989]   Labels:  app=mock
I0315 02:18:27.989]   Containers:
I0315 02:18:27.990]    mock-container:
I0315 02:18:27.990]     Image:        k8s.gcr.io/pause:2.0
I0315 02:18:27.990]     Port:         9949/TCP
... skipping 46 lines ...
I0315 02:18:30.322] Namespace:    namespace-1552616302-24160
I0315 02:18:30.323] Selector:     app=mock
I0315 02:18:30.323] Labels:       app=mock
I0315 02:18:30.323]               status=replaced
I0315 02:18:30.323] Annotations:  <none>
I0315 02:18:30.323] Replicas:     1 current / 1 desired
I0315 02:18:30.323] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0315 02:18:30.323] Pod Template:
I0315 02:18:30.323]   Labels:  app=mock
I0315 02:18:30.323]   Containers:
I0315 02:18:30.323]    mock-container:
I0315 02:18:30.323]     Image:        k8s.gcr.io/pause:2.0
I0315 02:18:30.323]     Port:         9949/TCP
... skipping 11 lines ...
I0315 02:18:30.333] Namespace:    namespace-1552616302-24160
I0315 02:18:30.333] Selector:     app=mock2
I0315 02:18:30.333] Labels:       app=mock2
I0315 02:18:30.333]               status=replaced
I0315 02:18:30.333] Annotations:  <none>
I0315 02:18:30.333] Replicas:     1 current / 1 desired
I0315 02:18:30.333] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I0315 02:18:30.333] Pod Template:
I0315 02:18:30.333]   Labels:  app=mock2
I0315 02:18:30.334]   Containers:
I0315 02:18:30.334]    mock-container:
I0315 02:18:30.334]     Image:        k8s.gcr.io/pause:2.0
I0315 02:18:30.334]     Port:         9949/TCP
... skipping 117 lines ...
W0315 02:18:35.929] I0315 02:18:33.599446   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:18:35.930] I0315 02:18:34.599738   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:18:35.930] I0315 02:18:34.599981   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:18:35.930] I0315 02:18:34.619623   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552616302-24160", Name:"mock", UID:"a28dc718-46c8-11e9-91a6-0242ac110002", APIVersion:"v1", ResourceVersion:"2647", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-hggkf
W0315 02:18:35.930] I0315 02:18:35.600300   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:18:35.930] I0315 02:18:35.600551   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:18:35.931] E0315 02:18:35.839305   58898 pv_protection_controller.go:116] PV pv0001 failed with : Operation cannot be fulfilled on persistentvolumes "pv0001": the object has been modified; please apply your changes to the latest version and try again
I0315 02:18:36.031] storage.sh:33: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
I0315 02:18:36.031] persistentvolume "pv0001" deleted
I0315 02:18:36.211] persistentvolume/pv0002 created
I0315 02:18:36.319] storage.sh:36: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0002:
I0315 02:18:36.396] persistentvolume "pv0002" deleted
I0315 02:18:36.583] persistentvolume/pv0003 created
W0315 02:18:36.684] E0315 02:18:36.587948   58898 pv_protection_controller.go:116] PV pv0003 failed with : Operation cannot be fulfilled on persistentvolumes "pv0003": the object has been modified; please apply your changes to the latest version and try again
W0315 02:18:36.684] I0315 02:18:36.600882   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:18:36.685] I0315 02:18:36.601127   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:18:36.785] storage.sh:39: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0003:
I0315 02:18:36.785] persistentvolume "pv0003" deleted
I0315 02:18:36.883] storage.sh:42: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
I0315 02:18:36.901] +++ exit code: 0
... skipping 498 lines ...
I0315 02:18:42.539] Successful
I0315 02:18:42.539] message:yes
I0315 02:18:42.540] has:yes
W0315 02:18:42.640] I0315 02:18:42.604223   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:18:42.641] I0315 02:18:42.604995   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:18:42.742] Successful
I0315 02:18:42.742] message:error: --subresource can not be used with NonResourceURL
I0315 02:18:42.743] has:subresource can not be used with NonResourceURL
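kubectl auth can-i takes either a resource (optionally narrowed by --subresource) or a bare non-resource URL path, never both at once; for example:

    kubectl auth can-i get pods --subresource=log    # resource plus subresource
    kubectl auth can-i get /logs                     # non-resource URL
    kubectl auth can-i get /logs --subresource=log   # error: --subresource can not be used with NonResourceURL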
I0315 02:18:42.758] Successful
I0315 02:18:42.875] Successful
I0315 02:18:42.875] message:yes
I0315 02:18:42.875] 0
I0315 02:18:42.876] has:0
... skipping 18 lines ...
W0315 02:18:43.271] 		{Verbs:[get list watch] APIGroups:[] Resources:[configmaps] ResourceNames:[] NonResourceURLs:[]}
I0315 02:18:43.372] legacy-script.sh:769: Successful get rolebindings -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-RB:
I0315 02:18:43.424] legacy-script.sh:770: Successful get roles -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-R:
I0315 02:18:43.555] legacy-script.sh:771: Successful get clusterrolebindings -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CRB:
I0315 02:18:43.684] legacy-script.sh:772: Successful get clusterroles -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CR:
I0315 02:18:43.795] Successful
I0315 02:18:43.796] message:error: only rbac.authorization.k8s.io/v1 is supported: not *v1beta1.ClusterRole
I0315 02:18:43.796] has:only rbac.authorization.k8s.io/v1 is supported
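kubectl auth reconcile only accepts rbac.authorization.k8s.io/v1 objects, so a manifest still pinned to v1beta1 is rejected before anything is applied; the invocation is presumably of the form (file name illustrative):

    kubectl auth reconcile -f clusterrole-v1beta1.yaml   # error: only rbac.authorization.k8s.io/v1 is supported: not *v1beta1.ClusterRole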
W0315 02:18:43.897] I0315 02:18:43.605382   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:18:43.897] I0315 02:18:43.605763   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
I0315 02:18:43.998] rolebinding.rbac.authorization.k8s.io "testing-RB" deleted
I0315 02:18:43.998] role.rbac.authorization.k8s.io "testing-R" deleted
I0315 02:18:43.998] clusterrole.rbac.authorization.k8s.io "testing-CR" deleted
... skipping 36 lines ...
I0315 02:18:45.317] +++ Running case: test-cmd.run_kubectl_explain_tests 
I0315 02:18:45.319] +++ working dir: /go/src/k8s.io/kubernetes
I0315 02:18:45.321] +++ command: run_kubectl_explain_tests
I0315 02:18:45.330] +++ [0315 02:18:45] Testing kubectl(v1:explain)
W0315 02:18:45.430] I0315 02:18:45.207592   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552616324-15391", Name:"cassandra", UID:"a89efd6a-46c8-11e9-91a6-0242ac110002", APIVersion:"v1", ResourceVersion:"2730", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-gw6r9
W0315 02:18:45.431] I0315 02:18:45.227002   58898 event.go:209] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1552616324-15391", Name:"cassandra", UID:"a89efd6a-46c8-11e9-91a6-0242ac110002", APIVersion:"v1", ResourceVersion:"2730", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-6b87v
W0315 02:18:45.431] E0315 02:18:45.232160   58898 replica_set.go:450] Sync "namespace-1552616324-15391/cassandra" failed with replicationcontrollers "cassandra" not found
I0315 02:18:45.532] KIND:     Pod
I0315 02:18:45.532] VERSION:  v1
I0315 02:18:45.532] 
I0315 02:18:45.532] DESCRIPTION:
I0315 02:18:45.532]      Pod is a collection of containers that can run on a host. This resource is
I0315 02:18:45.532]      created by clients and scheduled onto hosts.
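The KIND/VERSION/DESCRIPTION block above is kubectl explain output; the same schema browsing works for nested fields:

  kubectl explain pods                    # top-level Pod description, as above
  kubectl explain pods.spec.containers    # drill into a nested field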
... skipping 1157 lines ...
I0315 02:19:14.378] message:node/127.0.0.1 already uncordoned (dry run)
I0315 02:19:14.379] has:already uncordoned
I0315 02:19:14.461] node-management.sh:119: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
I0315 02:19:14.535] node/127.0.0.1 labeled
I0315 02:19:14.624] node-management.sh:124: Successful get nodes 127.0.0.1 {{.metadata.labels.test}}: label
I0315 02:19:14.690] Successful
I0315 02:19:14.690] message:error: cannot specify both a node name and a --selector option
I0315 02:19:14.691] See 'kubectl drain -h' for help and examples
I0315 02:19:14.691] has:cannot specify both a node name
I0315 02:19:14.754] Successful
I0315 02:19:14.754] message:error: USAGE: cordon NODE [flags]
I0315 02:19:14.754] See 'kubectl cordon -h' for help and examples
I0315 02:19:14.754] has:error\: USAGE\: cordon NODE
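Both failures above are argument-validation checks on the node-management commands; sketches of invocations that trigger them (node name taken from the test cluster):

  kubectl drain 127.0.0.1 --selector=test=label   # rejected: node name and --selector are mutually exclusive
  kubectl cordon                                  # rejected: USAGE: cordon NODE [flags]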
I0315 02:19:14.822] node/127.0.0.1 already uncordoned
I0315 02:19:14.893] Successful
I0315 02:19:14.894] message:error: You must provide one or more resources by argument or filename.
I0315 02:19:14.894] Example resource specifications include:
I0315 02:19:14.894]    '-f rsrc.yaml'
I0315 02:19:14.894]    '--filename=rsrc.json'
I0315 02:19:14.894]    '<resource> <name>'
I0315 02:19:14.894]    '<resource>'
I0315 02:19:14.894] has:must provide one or more resources
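The resource-specification message above is kubectl's generic usage error for commands given no resource at all; for example (illustrative, not necessarily the exact command the test ran):

  kubectl get    # with no arguments: "You must provide one or more resources by argument or filename."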
... skipping 15 lines ...
I0315 02:19:15.297] Successful
I0315 02:19:15.297] message:The following compatible plugins are available:
I0315 02:19:15.297] 
I0315 02:19:15.297] test/fixtures/pkg/kubectl/plugins/version/kubectl-version
I0315 02:19:15.297]   - warning: kubectl-version overwrites existing command: "kubectl version"
I0315 02:19:15.297] 
I0315 02:19:15.297] error: one plugin warning was found
I0315 02:19:15.298] has:kubectl-version overwrites existing command: "kubectl version"
I0315 02:19:15.364] Successful
I0315 02:19:15.364] message:The following compatible plugins are available:
I0315 02:19:15.364] 
I0315 02:19:15.364] test/fixtures/pkg/kubectl/plugins/kubectl-foo
I0315 02:19:15.365] test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo
I0315 02:19:15.365]   - warning: test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin: test/fixtures/pkg/kubectl/plugins/kubectl-foo
I0315 02:19:15.365] 
I0315 02:19:15.365] error: one plugin warning was found
I0315 02:19:15.365] has:test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin
I0315 02:19:15.434] Successful
I0315 02:19:15.434] message:The following compatible plugins are available:
I0315 02:19:15.434] 
I0315 02:19:15.434] test/fixtures/pkg/kubectl/plugins/kubectl-foo
I0315 02:19:15.434] has:plugins are available
I0315 02:19:15.500] Successful
I0315 02:19:15.500] message:
I0315 02:19:15.500] error: unable to find any kubectl plugins in your PATH
I0315 02:19:15.500] has:unable to find any kubectl plugins in your PATH
I0315 02:19:15.564] Successful
I0315 02:19:15.564] message:I am plugin foo
I0315 02:19:15.564] has:plugin foo
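These assertions probe kubectl's plugin mechanism: any executable named kubectl-* on PATH becomes a subcommand. A minimal sketch (directory and plugin names are illustrative):

  mkdir -p /tmp/kubectl-plugins
  printf '#!/bin/sh\necho "I am plugin foo"\n' > /tmp/kubectl-plugins/kubectl-foo
  chmod +x /tmp/kubectl-plugins/kubectl-foo
  PATH=/tmp/kubectl-plugins:$PATH kubectl foo           # -> I am plugin foo
  PATH=/tmp/kubectl-plugins:$PATH kubectl plugin list   # lists plugins, warning on overshadowed or overwritten names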
I0315 02:19:15.631] Successful
I0315 02:19:15.631] message:Client Version: version.Info{Major:"1", Minor:"15+", GitVersion:"v1.15.0-alpha.0.1224+a2313b874e222b", GitCommit:"a2313b874e222b2a33fe704fa54470dc0a94ec2a", GitTreeState:"clean", BuildDate:"2019-03-15T02:12:20Z", GoVersion:"go1.11.5", Compiler:"gc", Platform:"linux/amd64"}
... skipping 9 lines ...
I0315 02:19:15.709] 
I0315 02:19:15.712] +++ Running case: test-cmd.run_impersonation_tests 
I0315 02:19:15.714] +++ working dir: /go/src/k8s.io/kubernetes
I0315 02:19:15.716] +++ command: run_impersonation_tests
I0315 02:19:15.724] +++ [0315 02:19:15] Testing impersonation
I0315 02:19:15.790] Successful
I0315 02:19:15.791] message:error: requesting groups or user-extra for  without impersonating a user
I0315 02:19:15.791] has:without impersonating a user
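The assertion above checks that group impersonation without a user is rejected; sketches of both sides (user and group names are illustrative):

  kubectl get pods --as-group=system:masters             # rejected: groups without impersonating a user
  kubectl get pods --as=jane --as-group=system:masters   # valid combination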
W0315 02:19:15.891] I0315 02:19:13.626314   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:19:15.892] I0315 02:19:13.626523   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:19:15.892] I0315 02:19:14.626769   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
W0315 02:19:15.892] I0315 02:19:14.626938   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000002
W0315 02:19:15.892] I0315 02:19:15.627275   55607 controller.go:102] OpenAPI AggregationController: Processing item k8s_internal_local_delegation_chain_0000000001
... skipping 60 lines ...
W0315 02:19:19.463] I0315 02:19:19.459693   55607 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 02:19:19.463] I0315 02:19:19.459699   55607 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 02:19:19.463] I0315 02:19:19.459708   55607 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 02:19:19.463] I0315 02:19:19.459711   55607 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 02:19:19.463] I0315 02:19:19.459728   55607 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 02:19:19.464] I0315 02:19:19.459716   55607 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 02:19:19.464] W0315 02:19:19.459804   55607 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0315 02:19:19.464] I0315 02:19:19.459883   55607 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: []
W0315 02:19:19.464] I0315 02:19:19.459907   55607 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 02:19:19.464] I0315 02:19:19.459987   55607 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 02:19:19.465] I0315 02:19:19.459997   55607 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 02:19:19.465] I0315 02:19:19.460010   55607 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 02:19:19.465] I0315 02:19:19.460001   55607 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W0315 02:19:19.465] E0315 02:19:19.460019   55607 controller.go:179] rpc error: code = Unavailable desc = transport is closing
W0315 02:19:19.465] W0315 02:19:19.460083   55607 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0315 02:19:19.529] + make test-integration
I0315 02:19:19.629] No resources found
I0315 02:19:19.630] +++ [0315 02:19:19] TESTS PASSED
I0315 02:19:19.630] junit report dir: /workspace/artifacts
I0315 02:19:19.630] +++ [0315 02:19:19] Clean up complete
I0315 02:19:24.026] +++ [0315 02:19:24] Checking etcd is on PATH
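The integration suite needs a local etcd binary; if this check fails, the in-tree helper installs one (a sketch, run from the repository root):

  ./hack/install-etcd.sh
  export PATH="$(pwd)/third_party/etcd:${PATH}"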
... skipping 41 lines ...
I0315 02:32:50.458] ok  	k8s.io/kubernetes/test/integration/serving	51.089s
I0315 02:32:50.458] ok  	k8s.io/kubernetes/test/integration/statefulset	14.003s
I0315 02:32:50.458] ok  	k8s.io/kubernetes/test/integration/storageclasses	5.388s
I0315 02:32:50.458] ok  	k8s.io/kubernetes/test/integration/tls	7.719s
I0315 02:32:50.458] ok  	k8s.io/kubernetes/test/integration/ttlcontroller	11.373s
I0315 02:32:50.458] ok  	k8s.io/kubernetes/test/integration/volume	96.059s
I0315 02:32:50.458] FAIL	k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/test/integration	144.288s
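Only the apiextensions-apiserver package failed; a hedged sketch of re-running just that package through the integration harness (WHAT and KUBE_TEST_ARGS are the harness's documented knobs):

  make test-integration WHAT=./vendor/k8s.io/apiextensions-apiserver/test/integration \
      KUBE_TEST_ARGS="-run TestStatusSubresource" GOFLAGS="-v"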
I0315 02:33:05.389] +++ [0315 02:33:05] Saved JUnit XML test report to /workspace/artifacts/junit_d431ed5f68ae4ddf888439fb96b687a923412204_20190315-021929.xml
I0315 02:33:05.393] Makefile:184: recipe for target 'test' failed
I0315 02:33:05.406] +++ [0315 02:33:05] Cleaning up etcd
W0315 02:33:05.507] make[1]: *** [test] Error 1
W0315 02:33:05.507] !!! [0315 02:33:05] Call tree:
W0315 02:33:05.507] !!! [0315 02:33:05]  1: hack/make-rules/test-integration.sh:99 runTests(...)
I0315 02:33:05.720] +++ [0315 02:33:05] Integration test cleanup complete
I0315 02:33:05.721] Makefile:203: recipe for target 'test-integration' failed
W0315 02:33:05.822] make: *** [test-integration] Error 1
W0315 02:33:08.521] Traceback (most recent call last):
W0315 02:33:08.521]   File "/workspace/./test-infra/jenkins/../scenarios/kubernetes_verify.py", line 178, in <module>
W0315 02:33:08.521]     ARGS.exclude_typecheck, ARGS.exclude_godep)
W0315 02:33:08.521]   File "/workspace/./test-infra/jenkins/../scenarios/kubernetes_verify.py", line 140, in main
W0315 02:33:08.521]     check(*cmd)
W0315 02:33:08.521]   File "/workspace/./test-infra/jenkins/../scenarios/kubernetes_verify.py", line 48, in check
W0315 02:33:08.522]     subprocess.check_call(cmd)
W0315 02:33:08.522]   File "/usr/lib/python2.7/subprocess.py", line 186, in check_call
W0315 02:33:08.543]     raise CalledProcessError(retcode, cmd)
W0315 02:33:08.544] subprocess.CalledProcessError: Command '('docker', 'run', '--rm=true', '--privileged=true', '-v', '/var/run/docker.sock:/var/run/docker.sock', '-v', '/etc/localtime:/etc/localtime:ro', '-v', '/workspace/k8s.io/kubernetes:/go/src/k8s.io/kubernetes', '-v', '/workspace/k8s.io/:/workspace/k8s.io/', '-v', '/workspace/_artifacts:/workspace/artifacts', '-e', 'KUBE_FORCE_VERIFY_CHECKS=n', '-e', 'KUBE_VERIFY_GIT_BRANCH=master', '-e', 'EXCLUDE_TYPECHECK=n', '-e', 'EXCLUDE_GODEP=n', '-e', 'REPO_DIR=/workspace/k8s.io/kubernetes', '--tmpfs', '/tmp:exec,mode=1777', 'gcr.io/k8s-testimages/kubekins-test:1.13-v20190125-cc5d6ecff3', 'bash', '-c', 'cd kubernetes && ./hack/jenkins/test-dockerized.sh')' returned non-zero exit status 2
E0315 02:33:08.551] Command failed
I0315 02:33:08.551] process 697 exited with code 1 after 27.1m
E0315 02:33:08.551] FAIL: pull-kubernetes-integration
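The traceback above is just the CI wrapper surfacing the container's non-zero exit status; a trimmed, untested sketch of reproducing the same dockerized run locally (mount paths simplified from the full docker command in the traceback):

  docker run --rm --privileged \
    -v "$PWD:/go/src/k8s.io/kubernetes" \
    --tmpfs /tmp:exec,mode=1777 \
    gcr.io/k8s-testimages/kubekins-test:1.13-v20190125-cc5d6ecff3 \
    bash -c 'cd /go/src/k8s.io/kubernetes && ./hack/jenkins/test-dockerized.sh'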
I0315 02:33:08.552] Call:  gcloud auth activate-service-account --key-file=/etc/service-account/service-account.json
W0315 02:33:09.082] Activated service account credentials for: [pr-kubekins@kubernetes-jenkins-pull.iam.gserviceaccount.com]
I0315 02:33:09.144] process 127667 exited with code 0 after 0.0m
I0315 02:33:09.145] Call:  gcloud config get-value account
I0315 02:33:09.504] process 127679 exited with code 0 after 0.0m
I0315 02:33:09.505] Will upload results to gs://kubernetes-jenkins/pr-logs using pr-kubekins@kubernetes-jenkins-pull.iam.gserviceaccount.com
I0315 02:33:09.505] Upload result and artifacts...
I0315 02:33:09.505] Gubernator results at https://gubernator.k8s.io/build/kubernetes-jenkins/pr-logs/pull/75393/pull-kubernetes-integration/48794
I0315 02:33:09.505] Call:  gsutil ls gs://kubernetes-jenkins/pr-logs/pull/75393/pull-kubernetes-integration/48794/artifacts
W0315 02:33:10.876] CommandException: One or more URLs matched no objects.
E0315 02:33:11.043] Command failed
I0315 02:33:11.043] process 127691 exited with code 1 after 0.0m
W0315 02:33:11.043] Remote dir gs://kubernetes-jenkins/pr-logs/pull/75393/pull-kubernetes-integration/48794/artifacts not exist yet
I0315 02:33:11.043] Call:  gsutil -m -q -o GSUtil:use_magicfile=True cp -r -c -z log,txt,xml /workspace/_artifacts gs://kubernetes-jenkins/pr-logs/pull/75393/pull-kubernetes-integration/48794/artifacts
I0315 02:33:16.469] process 127833 exited with code 0 after 0.1m
W0315 02:33:16.470] metadata path /workspace/_artifacts/metadata.json does not exist
W0315 02:33:16.470] metadata not found or invalid, init with empty metadata
... skipping 22 lines ...